Why do Date.parse('2012-01-01') and Date.parse('1/1/2012') return different values?

Tested in several browsers (Firefox and Chrome) and on different platforms (OS X and Linux):

    > Date.parse('2012-01-01')
    1325376000000
    > Date.parse('1/1/2012')
    1325394000000

Relevant: https://github.com/portablemind/compass_agile_enterprise/wiki/Javascript-Date.parse-bug%3F

+4
3 answers

The string 2012-01-01 is interpreted as an ISO 8601 date, so the time zone is Z (UTC, Coordinated Universal Time). The format 1/1/2012 is not part of the standard; if an implementation accepts it at all (this is implementation-dependent), it is treated as local time.

For more consistent results, use a library such as Globalize.js.
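A minimal sketch of the difference in action (not from the original answer, and assuming your engine parses 1/1/2012 as local time): the gap between the two timestamps should equal your local UTC offset at that date.

    var iso = Date.parse('2012-01-01');  // ISO 8601 date-only string: parsed as UTC
    var local = Date.parse('1/1/2012');  // non-standard string: typically parsed as local time

    // getTimezoneOffset() returns minutes behind UTC, so the signs line up:
    // e.g. in US Eastern (UTC-5) both expressions are 300.
    var diffMinutes = (local - iso) / 60000;
    console.log(diffMinutes === new Date(iso).getTimezoneOffset()); // true in most engines

In the question's output, 1325394000000 - 1325376000000 = 18000000 ms, i.e. 5 hours, which matches a machine in US Eastern time (UTC-5).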

+3

If you append a Z to the end of the string, it will always be interpreted as UTC:

    > Date.parse('2012-01-01')
    1325376000000
    > Date.parse('1/1/2012')
    1325394000000
    > Date.parse('1/1/2012 Z')
    1325376000000
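If string parsing rules are a concern at all, one way to sidestep them entirely (a sketch added here, not part of the original answer) is to build the timestamp numerically with Date.UTC:

    // No string parsing involved; months are 0-based, so 0 = January.
    var ms = Date.UTC(2012, 0, 1);
    console.log(ms); // 1325376000000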
+1

I wrote the following code:

    var a = Date.parse('2012-01-01');
    var b = Date.parse('2012-01-01');
    var c = Date.parse('1/1/2012');
    alert(a + ' - ' + b + ' - ' + c);

And the result:

1325376000000 - 1325376000000 - 1325376000000

I assigned the same expression to both a and b because, according to http://www.w3schools.com/jsref/jsref_parse.asp, Date.parse returns the number of milliseconds since January 1, 1970, so the two calls should return the same value regardless of the time that passes between those lines.

I'm using Firefox 9.0.1, and the result is correct.
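One plausible explanation for all three values being identical (an observation added here, not part of the original answer): if the machine's local offset from UTC is zero, for example GMT outside of daylight saving time, then local-time parsing and UTC parsing coincide. A quick check:

    console.log(Date.parse('2012-01-01'));  // ISO date-only string, parsed as UTC
    console.log(Date.parse('1/1/2012'));    // implementation-dependent, usually local time
    console.log(new Date(2012, 0, 1).getTimezoneOffset()); // 0 would explain identical results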

0
