Background Description
I asked a question about defining an array of dates using a loop.
The array is built from a declared variable named "dateinterval". The way I wrote the code caused an error message in another loop, and another user provided a different loop that solved this.
Now that I have carefully studied the two solutions, I just don't understand why they don't give the same result.
My code
I wrote the code below to build an array of dates in UTC. However, the result is an array of millisecond values counted from January 1, 1970, 00:00:00, which is to say, numbers.
    for (var i = 0; i < difference; i++) {
        dateinterval[dateinterval.length] = dateinterval[0].setDate(dateinterval[0].getDate() + i);
    }
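To show what I mean by "a number": if I log what a single setDate call returns, I get a timestamp rather than a Date. (This is only a minimal sketch; the variable names and the start date are made-up examples, not from my actual code.)

    var d = new Date(2024, 0, 1);               // made-up example start date
    var returned = d.setDate(d.getDate() + 1);  // setDate changes d and returns a timestamp
    console.log(typeof returned);               // "number"
    console.log(returned);                      // milliseconds since January 1, 1970, 00:00:00 UTC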
Correct solution
Below is the correct code, provided by another user (thanks again!). This code builds an array of UTC dates.
    for (var i = 0; i < difference; i++) {
        var dt = new Date(dateinterval[0]);
        dt.setDate(dt.getDate() + i);
        dateinterval[dateinterval.length] = dt;
    }
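For reference, this is roughly how I run it; the starting array and the value of difference below are only placeholder examples, but the important part is that the array ends up holding Date objects rather than numbers.

    var dateinterval = [new Date(2024, 0, 1)];   // placeholder: an array holding one start date
    var difference = 3;                          // placeholder: number of days to add
    for (var i = 0; i < difference; i++) {
        var dt = new Date(dateinterval[0]);      // copy the first date
        dt.setDate(dt.getDate() + i);            // move the copy forward by i days
        dateinterval[dateinterval.length] = dt;  // append the copy
    }
    console.log(dateinterval);                   // an array of Date objects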
What I don't understand
I have all but given up comparing the two solutions to work out what the difference is, and I just don't get it.
To my untrained eye, the two pieces of code seem to perform exactly the same operation, and the only difference is how they are structured. I was told that setDate returns milliseconds, and that in my code those milliseconds were assigned to the array. But in the correct solution the dt variable is also given the value of setDate, which, as I understand it, should also be in milliseconds. So why does the line

    dateinterval[dateinterval.length] = dt;

not assign milliseconds to the dateinterval array?
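To make my confusion concrete, this is the comparison I keep staring at (a sketch reusing the names from above; the +1 offset is just an example):

    var dt = new Date(dateinterval[0]);           // a copy of the first date, as in the correct solution
    var returned = dt.setDate(dt.getDate() + 1);  // setDate returns milliseconds, as I was told
    dateinterval[dateinterval.length] = returned; // what my code stored: the returned number
    dateinterval[dateinterval.length] = dt;       // what the correct solution stores: dt itself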
Can someone explain this to me so that I can better understand JavaScript and not just replicate working solutions?