arr = Array(10).fill(false)
arr[-2] = true
console.log(arr)
[false, false, false, false, false, false, false, false, false, false, -2: true]
console.log(arr.length) // 10
I'm surprised that assigning to a negative index of the array adds a key-value pair to it instead, and that the length of the array is not incremented. Why is that?
asked Oct 26, 2019 by Henok Tesfaye

1 comment: you can assign arbitrary properties to an array, and since a negative index is not a true index of the array, the array length stays the same. – Rajith Thennakoon, Oct 26, 2019
4 Answers
An extract from the definition of Array.length on MDN:
The length property of an object which is an instance of type Array sets or returns the number of elements in that array. The value is an unsigned, 32-bit integer that is always numerically greater than the highest index in the array.
It only counts 0-based numeric indexes, up to the highest one, so anything invalid or negative is ignored. The length is therefore not necessarily the number of entries you visually see.
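A small sketch of my own illustrating this: length tracks only the highest numeric index, while negative or string keys become plain properties.

```javascript
const arr = [];

arr[3] = "x";            // numeric index: length becomes highest index + 1
console.log(arr.length); // 4 (slots 0-2 are empty)

arr[-2] = "y";           // negative "index": stored as the property "-2"
arr["foo"] = "z";        // string key: also just a property
console.log(arr.length); // still 4 -- properties don't affect length
```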
Relationship with length (from MDN):
If the only argument passed to the Array constructor is an integer between 0 and 2^32 - 1 (inclusive), this returns a new JavaScript array with its length property set to that number (note: this implies an array of arrayLength empty slots, not slots with actual undefined values). If the argument is any other number, a RangeError exception is thrown.
Arrays cannot use strings as element indexes (as in an associative array) but must use integers. Setting or accessing via non-integers using bracket notation (or dot notation) will not set or retrieve an element from the array list itself, but will set or access a variable associated with that array's object property collection. The array's object properties and list of array elements are separate, and the array's traversal and mutation operations cannot be applied to these named properties.
This shows the difference between setting an element and setting a property: anything set outside the valid index range of the array is treated as a property, just as on any other object.
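To make the element/property split concrete, here is a short sketch of my own. Both kinds of key show up as own keys of the object, but array traversal operations only see the elements:

```javascript
const arr = [10, 20];
arr[-1] = "prop";

// Elements and properties both appear as own keys of the object...
console.log(Object.keys(arr)); // ["0", "1", "-1"]

// ...but only the numeric indexes are array elements:
console.log(arr.length);           // 2
console.log(arr.includes("prop")); // false -- traversal ops skip properties
```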
Arrays are numerically indexed, but the tricky thing is that they also are objects that can have string keys/properties added to them (but which don't count toward the length of the array):
Indexes in an array start from 0 and can only be non-negative. When you try to access a negative index like a[-1], it acts as a key: -1 gets stored as the string key "-1" on the array, which is also an object. It won't count towards the length of the array; only numeric indexes count towards the length. You can think of arrays as a subtype of objects in JavaScript.
let a = [];
a[0] = 9;
a[-1] = 10; // will act as key "-1" added to object a with value 10
console.log(a); //[9]
console.log(a["-1"]); //10
console.log("Length of array " + a.length); // 1, key "-1" will not contribute to the lenght of an array
I would recommend you go through the article.
Just to make it clear that this is standard behavior and not some kind of lazy developer job, here is the official spec concerning the Array object:
1. Assert: IsPropertyKey(P) is true.
2. If P is "length", then
   a. Return ? ArraySetLength(A, Desc).
3. Else if P is an array index, then
   a. Let oldLenDesc be OrdinaryGetOwnProperty(A, "length").
   b. Assert: oldLenDesc will never be undefined or an accessor descriptor because Array objects are created with a length data property that cannot be deleted or reconfigured.
   c. Let oldLen be oldLenDesc.[[Value]].
   d. Let index be ! ToUint32(P).
   e. If index ≥ oldLen and oldLenDesc.[[Writable]] is false, return false.
   f. Let succeeded be ! OrdinaryDefineOwnProperty(A, P, Desc).
   g. If succeeded is false, return false.
   h. If index ≥ oldLen, then
      i. Set oldLenDesc.[[Value]] to index + 1.
      ii. Let succeeded be OrdinaryDefineOwnProperty(A, "length", oldLenDesc).
      iii. Assert: succeeded is true.
   i. Return true.
4. Return OrdinaryDefineOwnProperty(A, P, Desc).
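The two branches of the spec steps above can be observed directly. In this sketch of mine, the array-index branch (step 3) updates length, while the ordinary-property branch (step 4) leaves it alone:

```javascript
const a = [0, 1, 2];

// P = "5" is an array index and 5 >= oldLen (3),
// so per step 3.h length is set to index + 1:
a[5] = "x";
console.log(a.length); // 6, with empty slots at 3 and 4

// P = "-5" fails the array-index test, so the ordinary
// [[DefineOwnProperty]] path (step 4) runs and length is untouched:
a[-5] = "y";
console.log(a.length); // still 6
```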
So the array will treat the key as an array index if it matches the array index definition, which is :
An integer index is a String-valued property key that is a canonical numeric String (see 7.1.16) and whose numeric value is either +0 or a positive integer ≤ 2^53 - 1. An array index is an integer index whose numeric value i is in the range +0 ≤ i < 2^32 - 1.
Otherwise it will treat the key as an ordinary property.
source: http://www.ecma-international.org/publications/files/ECMA-ST/ECMA-262.pdf
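A short sketch of my own showing how the "canonical numeric String" rule plays out in practice. Only a key that round-trips through Number unchanged, and is non-negative, counts as an array index:

```javascript
const a = [];

a["2"] = "element";   // "2" is a canonical numeric string -> array index
a["02"] = "property"; // "02" is not canonical ("02" !== String(Number("02")))
a["-1"] = "property"; // negative -> outside the +0 <= i < 2^32 - 1 range

console.log(a.length);       // 3 -- only a["2"] extended the array
console.log(Object.keys(a)); // ["2", "02", "-1"]
```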
The best way to explain this is simply to state that there are no negative indices in JavaScript.
Just for completeness, the aforementioned reference already explains this:
An integer index is a String-valued property key that is a canonical numeric String (see 7.1.16) and whose numeric value is either +0 or a positive integer ≤ 2^53 - 1. An array index is an integer index whose numeric value i is in the range +0 ≤ i < 2^32 - 1.
However, you can also prove this using JavaScript iterators. Every JavaScript array has two methods (that return iterators) to keep track of the values and keys in the array: the array.values() and array.keys() methods, respectively.
So, simply iterating through the values and keys of an array is enough proof that an added "negative index" isn't a key-value pair among the array's elements but only a property:
let a = [1, 2, 3, 4];
a[-1] = 5;
let valueIterator = a.values();
let keyIterator = a.keys();
let valueCount = 0, keyCount = 0; // declare both; chained assignment would leak keyCount as a global
while(!valueIterator.next().done) {
valueCount++;
}
while(!keyIterator.next().done) {
keyCount++;
}
console.log("Number of values: " + valueCount); // 4
console.log("Number of keys: " + keyCount); // 4