Issue with java.sql.Time + GWT + browsers in Linux and Windows.

Yesterday, I ran into a problem at work.  On a Linux machine, I have a MySQL database table with a column of type time.  I use the following Java code to retrieve the value:

Time t = rs.getTime("columnName"); // rs is a ResultSet.

The Time object is sent via HTTP to a GWT web application.  The Time object is displayed correctly in Firefox on Linux (00:00:00) but not in Firefox or IE on Windows (00:30:00).

No matter what value is in the database, the browsers on Windows will display it with an additional 30 mins.

My search started with Java, the JVM, and tzdata.  After more than a day of testing and trial and error, it turned out the culprit is the JavaScript engine.  To understand the whole story, I must start with Java.

Based on the javadoc, java.sql.Time is actually a Date object.  The date-related fields should be set to 1970-Jan-01 and left untouched.  Only the time part of it is (should be) used.  When the value is read from the database, a Time object is created, and its date is set to 1970-Jan-01 (Asia/Kuala_Lumpur).  Based on the tzdata, the offset for “Asia/Kuala_Lumpur” on 1970-Jan-01 is only +7:30 instead of +8:00.  Thus, the Time object will have the value -27000000 (00:00:00 local time on 1970-Jan-01, which is 16:30:00 UTC on 1969-Dec-31).
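You can verify this tzdata claim directly from Java.  A quick sketch (the exact numbers depend on the tzdata shipped with your JVM; with standard tzdata it prints 450 and 480 minutes):

```java
import java.util.TimeZone;

public class KlOffset {
    public static void main(String[] args) {
        TimeZone kl = TimeZone.getTimeZone("Asia/Kuala_Lumpur");
        // Offset at the epoch instant (1970-Jan-01 00:00:00 UTC): +7:30 = 450 min
        System.out.println("1970-Jan-01: " + kl.getOffset(0L) / 60000 + " min");
        // Offset at the current instant: +8:00 = 480 min
        System.out.println("today:       " + kl.getOffset(System.currentTimeMillis()) / 60000 + " min");
    }
}
```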

When this data is sent to the GWT application, the JavaScript engine in a Linux browser correctly uses the +7:30 offset.  Thus, -27000000 plus +7:30 is 00:00:00.  But in both Windows browsers, the offset used is +8:00.  Thus, -27000000 plus +8:00 is 00:30:00.

To prove this: create an HTML file with the following content and open it with Firefox or Epiphany on Linux, then do the same with IE or Firefox on Windows.  Maybe another reason to use Linux instead of Windows:

<html>
<body>
    <script type="text/javascript">
        var d = new Date();
        document.write("Current Date offset (min) = ");
        document.write(d.getTimezoneOffset());
        document.write("<br>");
        d.setFullYear(1970, 0, 1); // months are zero-based: 0 = January
        document.write("1970-Jan-01 offset (min) = ");
        document.write(d.getTimezoneOffset());
    </script>
</body>
</html>

7 responses to “Issue with java.sql.Time + GWT + browsers in Linux and Windows.”

  1. Lim How Wei November 11, 2009 at 2:59 am

    just my 2 cents, but the only difference to me is the different timezone settings on the Linux and Windows OS. in Windows, there is no such thing as +7:30GMT, and it’s publicly recognized that Malaysia is in the +8GMT zone…

    the best way for me is to define a custom timezone object, so you may ensure the date is always consistent.

    maybe you can try this in Java instead:

    TimeZone tz = TimeZone.getDefault(); // local machine default timezone
    String[] tzs = TimeZone.getAvailableIDs(); // timezone ids available on the local machine
    TimeZone tzKL = TimeZone.getTimeZone("Asia/Kuala_Lumpur"); // local machine timezone data for KL

    i guess even for the same timezone, Java will give you a different answer on a different OS.

    • wenlong November 11, 2009 at 10:04 am

      Hi Pal,

        I don’t need the timezone object. What I need is only the time, but Java bundles it with a timezone.
        Timezone offset for Asia/Kuala_Lumpur is +8:00GMT since 1981 (Dec 31 23:30:00 to be exact). Before that, it was +7:30GMT, +7:00GMT, etc. See http://en.wikipedia.org/wiki/Malaysian_Standard_Time for a complete list of timezone history for Malaysia.
        Before writing this blog post, I’ve also tested a simple Java program that return the timezone offset for Asia/Kuala_Lumpur on the date 1970-Jan-1st on both Linux and Windows. Both return me the same result: +7:30GMT. I believe the tzdata on both OSes are the same or have the same timezone history for Asia/Kuala_Lumpur.

      The problem I encountered was mainly caused by the JavaScript engine and the design of the java.sql.Time class. Let me explain it in more detail:

        The JavaScript engine in Linux understands that the timezone offset for Asia/Kuala_Lumpur on 1970-Jan-1st is +7:30GMT, but in Windows, +8:00GMT is used. Why, and what exactly is causing the difference? I do not know.
        java.sql.Time is a class to represent a database type called time. It basically means HH:MM:SS without any date info. Meaning just a time, that’s all. Instead of creating a new class just for that purpose, Java decided to inherit from java.util.Date. The javadoc for java.sql.Time says clearly that the date portion of the class should be set to 1970-Jan-1st and should not be accessed.

      Thus the following:

        Database value is 00:00:00
        java.sql.Time value is date=1970-Jan-1st, timezone=Asia/Kuala_Lumpur, millisec from epoch=-7.5*60*60*1000 (see bottom on how to calculate this value)
        The Linux javascript engine translates the value: “-7.5*60*60*1000” + “+7:30GMT” = 00:00:00 (correct)
        The Windows javascript engine translates the value: “-7.5*60*60*1000” + “+8:00GMT” = 00:30:00 (wrong)
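      The breakdown above can be reproduced with java.sql.Time itself. A minimal sketch (it pins the default timezone to Asia/Kuala_Lumpur so the result does not depend on the machine it runs on):

```java
import java.sql.Time;
import java.util.TimeZone;

public class TimeEpochDemo {
    public static void main(String[] args) {
        // Pin the default timezone to match the setup described above
        TimeZone.setDefault(TimeZone.getTimeZone("Asia/Kuala_Lumpur"));
        Time t = Time.valueOf("00:00:00"); // date part is 1970-Jan-01, local time
        // getTime() is millisec from epoch: with the +7:30 offset this is -27000000
        System.out.println(t.getTime());
    }
}
```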

      Calculation of millisec from Epoch:
      In the example above, the java.sql.Time has the value of 1970-Jan-1st, Asia/Kuala_Lumpur, 00:00:00 local time. Millisec from Epoch means the same instant expressed in the timezone of Greenwich. Thus, the above local date-time will be translated to

        1969-Dec-31st, Greenwich, 16:30:00 (human term)
        1970-Jan-1st, Greenwich, -07:30:00 (computer term)
        -7.5*60*60*1000 = -27000000 millisec from Epoch (take note of the minus sign)
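      The same arithmetic can be checked with java.util.Calendar by building the local wall-clock time directly in the Asia/Kuala_Lumpur zone (a sketch; it assumes the JVM’s tzdata carries the +7:30 rule for 1970):

```java
import java.util.Calendar;
import java.util.TimeZone;

public class EpochMillisCheck {
    public static void main(String[] args) {
        Calendar cal = Calendar.getInstance(TimeZone.getTimeZone("Asia/Kuala_Lumpur"));
        cal.clear(); // zero out all fields, including milliseconds
        cal.set(1970, Calendar.JANUARY, 1, 0, 0, 0); // 00:00:00 local wall-clock time
        // -7.5h * 60 * 60 * 1000 = -27000000 ms from the Epoch
        System.out.println(cal.getTimeInMillis());
    }
}
```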

      I cannot decide who is at fault here: The design of the java.sql.Time? or the Javascript engine?

  2. Lim How Wei November 11, 2009 at 11:57 am

    something weird here, your Windows machine seems to have the +7:30GMT, while mine doesn’t…

    i believe the javascript engine will just make use of the local machine timezone data… i’m not too sure how the Linux javascript engine can be different between OSes since it should be tied to the browser. any readings where you found out the javascript engines in Linux FF and Windows FF are different?

    • wenlong November 11, 2009 at 4:28 pm

      I believe your Windows machine will give you +7:30GMT also.

      In Windows:
      Java Apps returns +7:30GMT on 1970-Jan-1st
      Javascript (Firefox & IE) returns +8:00GMT on 1970-Jan-1st

      In Linux:
      Java Apps returns +7:30GMT on 1970-Jan-1st
      Javascript (Firefox & epiphany) returns +7:30GMT on 1970-Jan-1st

      • Lim How Wei November 11, 2009 at 5:41 pm

        Java Apps follow the Epoch and handle the case you mentioned (the timezone switch), and Linux handles it at the OS level also. unfortunately, Windows is not handling this.

        Javascript engine should follow what OS can give…

        so your DB really holding data with very old date?

        anyway, in MySQL, Time object is not really a Time teller… should use timestamp or datetime object instead.

  3. wenlong November 11, 2009 at 9:15 pm

    For my case, what I need is only the HH:MM:SS, thus “time” is the correct data type to use. Timestamp or datetime will include the yyyy-mm-dd, that’s why I didn’t use them.

    • Lim How Wei November 12, 2009 at 2:02 am

      hmm… i just wonder, if you use datetime, will this problem still occur??? of course if you have a date like 1970-1-1, Windows will still have a problem 😛
