If I set only a high index in an array, does it waste memory?

Posted 2019-01-08 21:48

Question:

In Javascript, if I do something like

var alpha = [];
alpha[1000000] = 2;

does this waste memory somehow? I remember reading something about JavaScript arrays still setting values for unspecified indices (maybe setting them to undefined?), but I think that may have had something to do with delete. I can't really remember.

Answer 1:

See this topic: are-javascript-arrays-sparse

In most JavaScript implementations (probably all modern ones), arrays are sparse. That means no, it's not going to allocate memory up to the maximum index.

If it's anything like a Lua implementation, there are actually an internal array and a dictionary: densely populated parts from the starting index are stored in the array, and sparse portions in the dictionary.
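You can check the sparse behavior directly; here is a minimal sketch (illustrative only, nothing beyond standard Array methods is assumed):

var alpha = [];
alpha[1000000] = 2;

// length reports the highest index + 1, but only one element is actually stored
console.log(alpha.length);              // 1000001
console.log(Object.keys(alpha).length); // 1 -- only "1000000" exists
console.log(0 in alpha);                // false -- index 0 is a hole, not undefined

// forEach (like map and filter) skips holes entirely
alpha.forEach(function (value, index) {
  console.log(index, value);            // logs only: 1000000 2
});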



Answer 2:

This is an old myth. The other indexes on the array will not be assigned.

When you assign to a property name that is an "array index" (e.g. alpha[10] = 'foo', where the name represents an unsigned 32-bit integer) and that index is greater than or equal to the current value of the length property of an Array object, two things will happen:

  1. The "index named" property will be created on the object.
  2. The length property will be set to that index + 1.

Proof of concept:

var alpha = [];
alpha[10] = 2;
alpha.hasOwnProperty(0);  // false, the property doesn't exist
alpha.hasOwnProperty(9);  // false
alpha.hasOwnProperty(10); // true, the property exists
alpha.length;             // 11

As you can see, the hasOwnProperty method returns false when we test the presence of the 0 or 9 properties, because they don't physically exist on the object, whereas it returns true for 10, because that property was created.

This misconception probably comes from popular JS consoles, like Firebug: when they detect that the object being printed is an array-like one, they simply loop over it, showing each of the index values from 0 to length - 1.

For example, Firebug detects array-like objects simply by checking whether they have a length property whose value is an unsigned 32-bit integer (less than 2^32 - 1) and a splice property that is a function:

console.log({length:3, splice:function(){}});
// Firebug will log: `[undefined, undefined, undefined]`

In the above case, Firebug internally makes a sequential loop to show each of the property values, but none of the indexes really exist. Printing [undefined, undefined, undefined] gives you the false impression that those properties exist, or that they were "allocated", but that's not the case...
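If you want to tell a real hole apart from a stored undefined, the in operator (or hasOwnProperty) makes the difference visible; a small sketch:

var arr = [];
arr[2] = undefined;   // index 2 exists and holds the value undefined
arr.length = 4;       // indexes 0, 1 and 3 are holes

for (var i = 0; i < arr.length; i++) {
  // arr[i] reads as undefined every time, but only index 2 is a real property
  console.log(i, arr[i], i in arr);
}
// 0 undefined false
// 1 undefined false
// 2 undefined true
// 3 undefined false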

This has always been the case; it is specified as far back as the ECMAScript 1st Edition Specification (1997), so you shouldn't worry about implementation differences.



Answer 3:

About a year ago, I did some testing on how browsers handle arrays (obligatory self-promotional link to my blog post.) My testing was aimed more at CPU performance than at memory consumption, which is much harder to measure. The bottom line, though, was that every browser I tested with seemed to treat sparse arrays as hash tables. That is, unless you initialized the array from the get-go by putting values in consecutive indexes (starting from 0), the array would be implemented in a way that seemed to optimize for space.

So while there's no guarantee, I don't think that setting array[100000] will take any more room than setting array[1] -- unless you also set all the indexes leading up to those.
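A rough way to see the same effect without a profiler (just a sketch, not a rigorous memory measurement) is to compare how many own index properties a single sparse assignment creates versus a densely filled array of the same length:

var sparse = [];
sparse[100000] = 1;                      // only one element is actually stored

var dense = new Array(100001).fill(0);   // every index from 0 to 100000 is stored

console.log(sparse.length, Object.keys(sparse).length); // 100001 1
console.log(dense.length, Object.keys(dense).length);   // 100001 100001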



Answer 4:

I don't think so, because JavaScript treats arrays kind of like dictionaries, but with integer keys.

alpha[1000000] === alpha["1000000"]  // true, both name the same property
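To expand on that (a small illustrative sketch): property keys are actually strings under the hood, so the numeric index is converted to a string before the lookup, and only the keys you assigned exist:

var alpha = [];
alpha[1000000] = 2;

console.log(alpha["1000000"]);             // 2 -- same property as alpha[1000000]
console.log(Object.keys(alpha));           // ["1000000"] -- the one key, stored as a string
console.log(typeof Object.keys(alpha)[0]); // "string"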



Answer 5:

I don't really know JavaScript, but it would be pretty odd behaviour if it DIDN'T allocate space for the entire array. Why would you think it wouldn't take up space? You're asking for a huge array. If it didn't give it to you, that would be a specific optimisation.

This obviously ignores OS optimisations such as memory overcommit and other kernel and implementation specifics.