Recently we discovered a weird behavior in one of our applications. A control was showing a different value than it had been showing for years under the same circumstances.
The control was supposed to show two blanks and 10 minutes, but instead it showed 23 hours and 29 minutes.
This was a piece of legacy code that hadn’t been touched for at least 3 years.
In our database (SQL Server) we have a DateTime field to store a duration. This field is filled by creating a new SqlDateTime object and then adding the required amount of time to the object.
DateTime dateTime = (DateTime)SqlDateTime.MinValue; // 1900-01-01 00:00:00, the SqlDateTime base date
dateTime = dateTime.AddMinutes(10);
This stores the value in the database as “1900-01-01 00:10:00.000”.
When we want to display the current value in the control, we retrieve it and convert it to JSON.
string jsonOutput = js.Serialize(dateTime);
The value in the jsonOutput string is “/Date(-2208991800000)/”.
Wow that’s a nice number, what the heck does it mean?
It’s the number of milliseconds since January 1, 1970, the start of the Unix epoch. And because the number is negative, it represents milliseconds before the epoch.
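You can see which instant that number represents, independent of any time zone, by rendering it as UTC — a quick sketch:

```javascript
// 2208991800000 ms before the epoch, rendered in UTC. Note that it is
// 23:10 the day before: the serializer already converted our local
// 00:10 (GMT+1) to UTC.
var jsonMs = -2208991800000;
var jsonDate = new Date(jsonMs);
console.log(jsonDate.toUTCString()); // "Sun, 31 Dec 1899 23:10:00 GMT"
```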
If you’re reading this in Chrome, press F12 to open the Console and type this:
new Date(-2208991800000)
If you’re on Chrome 67 and in Western Europe you’ll likely see something like this:
Sun Dec 31 1899 23:29:32 GMT+0019 (Central European Standard Time)
What the … that’s not what we put in. What’s going on? (At least we now know where the 23 hours and 29 minutes originate.)
Well, it turns out that back in 1900 our time zone had a different offset than in 1970 (and today), and Chrome implemented a change that makes a Date object report the time zone offset that was actually in effect on the date it represents.
We know that in January 1970 our time zone offset was GMT+0100, and we see that in 1900 it was GMT+0019, so surely if we add 41 minutes we get the correct time, right? Well, not exactly …
var unixTime = new Date(0);
var jsonTime = new Date(-2208991800000);
var correction = (jsonTime.getTimezoneOffset() - unixTime.getTimezoneOffset()) * 60000;
new Date(jsonTime.getTime() + correction);
Copy/Paste that in your Chrome console and you’ll see this output:
Mon Jan 01 1900 00:10:32 GMT+0019 (Central European Standard Time)
It’s close, but it’s not correct. The time zone offset is returned in minutes (hence the multiplication by 60000 in the correction variable), but apparently the historical offset wasn’t exactly 19 minutes.
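The 32 stray seconds suggest the historical offset was +00:19:32 (an assumption based on the output above; getTimezoneOffset() only reports whole minutes). A minimal sketch of the discrepancy:

```javascript
// The displayed time was 00:10:32 instead of 00:10:00: a 32-second residue.
// The historical offset (assumed +00:19:32) has a seconds component that
// getTimezoneOffset(), which reports whole minutes, cannot express.
var historicalOffsetMs = (19 * 60 + 32) * 1000; // +00:19:32
var reportedOffsetMs = 19 * 60 * 1000;          // getTimezoneOffset() -> 19 minutes
console.log((historicalOffsetMs - reportedOffsetMs) / 1000 + " seconds lost"); // "32 seconds lost"
```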
Luckily there’s the toUTCString() function. Since we’re in the GMT+1 time zone, we need to add an hour before converting the value to a UTC string.
var jsonTime = new Date(-2208991800000);
var correctedJsonTime = new Date(jsonTime.getTime() + (60 * 60 * 1000));
correctedJsonTime.toUTCString();
The output is:
“Mon, 01 Jan 1900 00:10:00 GMT”
But now it’s a string and not a Date object, so we can’t use functions like getFullYear() and our code breaks. And if we cast this back to a Date, what do we get?
new Date("Mon, 01 Jan 1900 00:10:00 GMT");
Mon Jan 01 1900 00:29:32 GMT+0019 (Central European Standard Time)
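Note that before the string conversion, the corrected Date object itself still exposes the intended time parts through the getUTC* accessors — a minimal sketch:

```javascript
// Reading the intended time parts straight from the corrected Date object,
// using the UTC accessors instead of converting to a string.
var jsonTime = new Date(-2208991800000);
var correctedTime = new Date(jsonTime.getTime() + 60 * 60 * 1000); // add the missing GMT+1 hour
console.log(correctedTime.getUTCFullYear()); // 1900
console.log(correctedTime.getUTCHours());    // 0
console.log(correctedTime.getUTCMinutes());  // 10
```

This still leaves the hard-coded “add an hour” logic in place, though, so it wasn’t a complete solution for us either.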
We really didn’t want to resort to string parsing to get the time parts. And you can’t just add an hour to any date passed into the function, so some logic would have to be added. Not to mention that browsers like IE, Safari and Firefox have not yet implemented this change, resulting in even more logic.
Perhaps it would have been possible to customize the Serialize() function to return a string, but we opted to use Newtonsoft’s Json.NET library instead, because its JsonConvert.SerializeObject() method serializes a date as an ISO 8601 string by default.
new Date("1900-01-01T00:10:00");
Mon Jan 01 1900 00:10:00 GMT+0019 (Central European Standard Time)
Excellent. We now have a Date object with the same date and time as we’ve stored in our database.
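Because the ISO 8601 string has no zone designator, engines that follow the current spec parse it as local time, so the wall-clock parts round-trip regardless of the host time zone — a quick check:

```javascript
// An ISO string without a Z or offset is interpreted as local time, so the
// getters return exactly the wall-clock values we stored.
var restored = new Date("1900-01-01T00:10:00");
console.log(restored.getFullYear()); // 1900
console.log(restored.getHours());    // 0
console.log(restored.getMinutes());  // 10
```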
And that’s it, bug fixed, case closed. I hope this story will save others some time with this or similar issues.
As a final remark I’d like to say that saving a duration as a DateTime data type is probably not the best approach. I’d recommend storing a duration as an integer if you need second or minute precision, or as a long (BIGINT) holding Ticks if you need more precision.
The smallest unit of time is the tick, which is equal to 100 nanoseconds or one ten-millionth of a second. There are 10,000 ticks in a millisecond. The value of the Ticks property can be negative or positive to represent a negative or positive time interval.
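As a sketch of that recommendation: with the duration stored as a plain tick count, displaying it becomes simple arithmetic with no time zones involved (the constants follow the 10,000-ticks-per-millisecond definition above):

```javascript
// 10 minutes stored as ticks (100 ns units): 10 * 60 * 10,000,000.
var TICKS_PER_MINUTE = 60 * 1000 * 10000;
var durationTicks = 6000000000;            // the stored BIGINT value
var totalMinutes = durationTicks / TICKS_PER_MINUTE;
var hours = Math.floor(totalMinutes / 60);
var minutes = totalMinutes % 60;
console.log(hours + " hours, " + minutes + " minutes"); // "0 hours, 10 minutes"
```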