Deep understanding: how code structure affects the contents of a date array built in a loop

Background Description

I asked a question about defining an array of dates using a loop.

The array is built from a declared variable named dateinterval. The way I originally wrote the loop did not work as intended, and another user provided a different loop that solved the problem.

Now that I have carefully studied the two different solutions, I just don't understand why they do not give the same result.

My code

I wrote the code below to build an array of dates in UTC. However, the result is an array of dates expressed as milliseconds since January 1, 1970, 00:00:00 UTC; in other words, numbers.

    for (var i = 0; i < difference; i++) {
      dateinterval[dateinterval.length] = dateinterval[0].setDate(dateinterval[0].getDate() + i);
    }
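For example, with a hypothetical start date of 1 January 2015 and a difference of 3 (these values are just for illustration), the loop fills the array with numbers rather than Date objects:

    var dateinterval = [new Date(2015, 0, 1)]; // assumed starting point
    var difference = 3;                        // assumed day count
    for (var i = 0; i < difference; i++) {
      dateinterval[dateinterval.length] = dateinterval[0].setDate(dateinterval[0].getDate() + i);
    }
    // Every appended entry is a millisecond timestamp (a number), not a Date;
    // the exact values depend on the local time zone.
    console.log(dateinterval);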

Correct solution

Below is the correct code provided by another user (thanks again!) This code defines an array of UTC dates.

    for (var i = 0; i < difference; i++) {
      var dt = new Date(dateinterval[0]);
      dt.setDate(dt.getDate() + i);
      dateinterval[dateinterval.length] = dt;
    }
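With the same hypothetical starting values as above, this version fills the array with actual Date objects:

    var dateinterval = [new Date(2015, 0, 1)]; // assumed starting point
    var difference = 3;                        // assumed day count
    for (var i = 0; i < difference; i++) {
      var dt = new Date(dateinterval[0]); // copy the start date
      dt.setDate(dt.getDate() + i);       // advance the copy by i days
      dateinterval[dateinterval.length] = dt;
    }
    console.log(dateinterval); // four Date objects: Jan 1, Jan 1, Jan 2, Jan 3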

What I do not understand

I have stared at the two solutions for a long time trying to work out what the difference is, and I just don't get it.

To my untrained eye, the two pieces of code seem to perform exactly the same operations, and the only difference is how they are structured. I was told that setDate returns milliseconds, and that in my code those milliseconds were what got assigned into the array. But in the correct solution, the dt variable is also assigned the value of setDate, which, as I understand it, should likewise be milliseconds. So why does the line:

 dateinterval[dateinterval.length] = dt; 

not assign milliseconds to the dateinterval array?

Can someone explain this to me so that I can better understand JavaScript instead of just replicating working solutions?

1 answer

When you do:

    dateinterval[dateinterval.length] = dateinterval[0].setDate(dateinterval[0].getDate() + i);

you assign the return value of dateinterval[0].setDate(…) to dateinterval[…]. That return value is the result of TimeClip, i.e. the Date object's internal time value (milliseconds since January 1, 1970 UTC). See ECMA-262 §20.3.4.20.
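You can see the type of the return value directly (a minimal sketch with an arbitrary date):

    var d = new Date(2015, 0, 1);              // 1 January 2015
    var returned = d.setDate(d.getDate() + 1); // setDate mutates d and returns a number
    console.log(typeof returned);              // "number"
    console.log(returned === d.getTime());     // true: it is the internal time value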

So you need to change the date first:

    dateinterval[0].setDate(dateinterval[0].getDate() + i);

then assign an object reference:

 dateinterval[dateinterval.length] = dateinterval[0]; 
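Note, though, that this stores the same object reference in every slot, so all the entries end up pointing at one repeatedly mutated Date; that is exactly why the working solution above creates a new Date for each entry. A minimal sketch of the shared-reference effect (hypothetical values):

    var shared = new Date(2015, 0, 1);
    var arr = [shared, shared];     // two slots, one object
    shared.setDate(15);             // mutate the single object
    console.log(arr[0] === arr[1]); // true: both slots hold the same reference
    console.log(arr[0].getDate(), arr[1].getDate()); // 15 15: both "entries" changed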

Edit

It may help to look at a simple case.

    // Create a new Date for 2015-01-01
    var date = new Date(2015, 0, 1);

    // Change the date to 2015-01-02
    var x = date.setDate(2);

    // The return value from the method is the internal time value
    console.log(x);

    // Check the time value
    console.log(new Date(x)); // 2 January, 2015

It seems that the OP expects setDate to return the original Date object, but it does not: it returns the time value. The original code will at least produce Date objects rather than numbers if the return value from setDate is converted back into a Date:

    dateinterval[dateinterval.length] = new Date(dateinterval[0].setDate(dateinterval[0].getDate() + i));

however, this keeps mutating the original Date in dateinterval[0] on every iteration (so the day offsets accumulate from one pass to the next) while also creating a brand-new Date each time. Copying the start date first, as the working solution does, avoids both problems.
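A small sketch of the accumulation problem (hypothetical start date and count):

    var dateinterval = [new Date(2015, 0, 1)]; // 1 January 2015
    var difference = 4;
    for (var i = 0; i < difference; i++) {
      dateinterval[dateinterval.length] =
        new Date(dateinterval[0].setDate(dateinterval[0].getDate() + i));
    }
    // Logs [7, 1, 2, 4, 7]: index 0 has been mutated to the 7th, and the
    // appended days run 1, 2, 4, 7 instead of the consecutive 1, 2, 3, 4.
    console.log(dateinterval.map(function (d) { return d.getDate(); }));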


Source: https://habr.com/ru/post/986127/

