How does JavaScript determine whether an array index is an integer?

It seems that the indices of a JavaScript array are actually integers, so a[0] matches a['0'], while a[1.0] would match a['1.0'] rather than a[1]. At the same time, the array has a length property that is updated automatically when values are assigned to integer keys. So how does JavaScript know that a key is an integer and that length needs to change? If I do this:

 var a = 4/2; var b = 8/4; var c = 2; var d = 1*2;

are arr[2], arr[0+2], arr[1*2], arr[a], arr[b], arr[c], and arr[d] all the same?

We often access the array in a loop as follows:

    for (i = 0; i < 100; i++) {
        arr[i] = 1;       // this is arr[0], arr[1], ... right?
        arr[i + 0.0] = 1; // is this arr[0] or arr['0.0'] ?
    }

If I write this:

    for (i = 0.1; i < 100; i += 0.1) {
        arr[i*10] = 1; // what does this do? arr[1] = 1, arr[1.0] = 1, or arr[1.00000] = 1 ?
    }

What does the assignment do in this loop?

3 answers

JavaScript arrays are not really arrays; they are JavaScript objects whose prototype makes them act like arrays. arr['one'] = 1 is valid JavaScript.

The way arr.length works is, in effect, to look at the keys of the array, find the largest numeric index (JavaScript doesn't actually have integers, just floating-point numbers), and return that number + 1.

Try it:

    var arr = [];
    arr.one = 1;             // a named property, not an index
    arr[8] = 1;              // a numeric index
    console.log(arr.length); // 9

For starters, in JavaScript (ES5) there is no such thing as an integer; JavaScript (ES5) only has numbers.

Secondly, JavaScript performs a lot of implicit type coercion. Here is an example:

    if (1 == '1') console.log('very truthy');

If you use double equals, the string '1' is coerced to a Number, the resulting value (1) is compared with 1 (which is true, 1 == 1), and the string 'very truthy' is logged.

If you use triple equals, no implicit coercion happens.

    if (1 === '1') console.log("this won't get logged");

Using triple equals prevents accidental coercion.
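To tie this back to array indexing, here is a minimal sketch (assuming any standard engine; the array contents are just for illustration). Property keys are ultimately strings, so a numeric index is stringified before the lookup:

    var arr = ['zero', 'one', 'two'];
    console.log(arr[1] === arr['1']);   // true -- both look up the key "1"
    console.log(arr[1.0] === arr['1']); // true -- (1.0).toString() is "1"
    console.log(arr['1.0']);            // undefined -- "1.0" is a different key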

Next, when you assign a value at an integer index of the array, that index gets the value you give it AND the length is updated.

    var a = [];
    a[0] = 0;
    a[1] = 1;
    a[2.0] = 2;
    // a is now [0, 1, 2] and a.length is 3

When you try to set an index that is not an integer (for example 1.1), JavaScript converts it to a string (1.1 becomes "1.1"), adds that as a new property on the array, and sets the value on it. Custom properties of an array do not affect its length.

    var a = [];
    a[1.1] = 1.1;
    a.prop = "property";
    // a is still [], an empty array (length 0)
    console.log(a.prop, a['1.1']); // "property", 1.1

When you add a custom property to a JS array, you are mutating the object, which then also behaves like an object literal.

So, in your case, you end up with a mash-up: an array-ish / object-literal-ish object. NOTE: if you add a custom property to a JS Number or String, it is NOT converted to an object literal. The behavior you are seeing is unique to JS arrays.
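To illustrate that note, here is a small sketch (assuming non-strict mode; in strict mode assigning a property to a primitive throws a TypeError):

    var n = 5;
    n.prop = "property";   // silently ignored: n is a primitive, so the temporary
                           // wrapper object is thrown away
    console.log(n.prop);   // undefined

    var arr = [];
    arr.prop = "property"; // arrays are objects, so the property sticks
    console.log(arr.prop); // "property"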


From playing around in the Chrome dev tools, I think your statements are not entirely correct. I found that:

    arr[0] == arr[0.0] == arr[0.0000] == arr["0"]

BUT

 arr[0] != arr["0.0"] arr[0.0] != arr["0.0"] 

So it seems that, internally, the array calls .toString() on the index you give it and uses the result as the actual key. Note:

    (0).toString() == "0"
    (0.0000).toString() == "0"
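Applying that to the loops in the question, here is a short sketch (assumptions: any standard engine, arr starts empty). A float that stringifies to an integer hits the same slot as the integer, while one that does not becomes a plain property and leaves length alone:

    var arr = [];
    arr[1.0] = 1;                  // (1.0).toString() === "1", same slot as arr[1]
    arr[1.00000] = 2;              // still the key "1"
    console.log(arr[1]);           // 2
    console.log(arr.length);       // 2

    var i = 0.1 + 0.1 + 0.1;       // 0.30000000000000004, not 0.3
    arr[i] = 1;                    // key is "0.30000000000000004"
    console.log(Object.keys(arr)); // ["1", "0.30000000000000004"]
    console.log(arr.length);       // still 2 -- non-index keys don't change length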

Source: https://habr.com/ru/post/1443156/

