I was building a large Map in Node.js v11.9.0 and it kept failing with "FATAL ERROR: invalid table size Allocation failed - JavaScript heap out of memory". My map's keys and values shouldn't come anywhere near Node's heap size limit, so I tried simply creating a map and inserting numeric keys and values into it:
var N = Math.pow(2, 26);
var map = new Map();
for (var i = 0; i < N; i++) {
  map.set(i, i + 1);
  if (i % 1e5 === 0) { console.log(i / 1e6); }
}
This program crashes Node after inserting roughly 16.6 million entries. That number seemed suspiciously close to 2^24, so I replaced the logging above with if (i > 16777200) { console.log(i); } and saw the program crash immediately after successfully printing "16777215", which is one less than 2^24.
Question. Is there a documented limit on the number of entries in Node's Map close to 2^24? Is there any way to raise that limit?
(N.B. Running Node as node --max-old-space-size=4096 doesn't prevent the crash, since Node is using far less than 4 GB of RAM.)
(N.B. 2. I don't think this is a hash collision issue since in my actual code, the map contains (short-ish) strings rather than numbers.)
(N.B. 3. Running the above program in Firefox's JavaScript console does not kill Firefox; it keeps adding entries well past 30 million. Chrome, however, crashes just like Node, so this is likely a V8 limitation.)
What's interesting is that if you change your code to create two Map objects and insert into them simultaneously, they both crash at exactly the same point, around 16.7 million entries each. So something odd happens when more than 2^24 entries are made in any given Map; the limit applies per Map, not globally across all Map objects.
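Since the limit appears to apply per Map instance rather than globally, one possible workaround is to shard entries across several Maps. A minimal sketch under that assumption (the ShardedMap class and its hash function are my own illustration, not part of V8 or Node):

```javascript
// Sketch: spread entries across multiple Maps so that no single Map
// approaches V8's per-Map entry limit (~2^24 entries).
class ShardedMap {
  constructor(shardCount = 4) {
    this.shards = Array.from({ length: shardCount }, () => new Map());
    this.shardCount = shardCount;
  }
  // Pick a shard using a simple string hash of the key.
  _shard(key) {
    const s = String(key);
    let h = 0;
    for (let i = 0; i < s.length; i++) {
      h = (h * 31 + s.charCodeAt(i)) | 0;
    }
    return this.shards[Math.abs(h) % this.shardCount];
  }
  set(key, value) { this._shard(key).set(key, value); return this; }
  get(key) { return this._shard(key).get(key); }
  has(key) { return this._shard(key).has(key); }
  get size() { return this.shards.reduce((n, m) => n + m.size, 0); }
}

const m = new ShardedMap(8);
m.set('a', 1);
m.set('b', 2);
console.log(m.get('a'), m.size); // 1 2
```

Note that sharding by a string hash means keys that are distinct objects but stringify identically would collide, so this sketch is only suitable for primitive keys like the strings in the question.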
I think you've found a V8 bug that needs to be reported.
V8 developer here. I can confirm that 2^24 is the maximum number of entries in a Map. That's not a bug, it's just the implementation-defined limit. The limit is determined by:

- The FixedArray backing store of the Map has a maximum size of 1GB (independent of the overall heap size limit).
- At 8 bytes per element on a 64-bit system, 1GB holds 2^27 elements.
- A Map needs 3 elements per entry (key, value, next bucket link), has a maximum load factor of 50% (to avoid the slowdown caused by many bucket collisions), and its capacity must be a power of 2. 2^27 / (3 * 2) rounded down to the next power of 2 is 2^24, which is the limit you observe.

FWIW, there are limits to everything: besides the maximum heap size, there's a maximum
String length, a maximum Array length, a maximum ArrayBuffer length, a maximum BigInt size, a maximum stack size, etc. Any one of those limits is potentially debatable, and sometimes it makes sense to raise them, but the limits as such will remain. Off the top of my head I don't know what it would take to bump this particular limit by, say, a factor of two -- and I also don't know whether a factor of two would be enough to satisfy your expectations.
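The arithmetic in that derivation can be checked directly (the 8 bytes per FixedArray element assumes a 64-bit build):

```javascript
// Maximum FixedArray backing store: 1 GB, at 8 bytes per element (64-bit).
const maxElements = Math.pow(2, 30) / 8;          // 2^27 = 134217728
// Each Map entry consumes 3 elements (key, value, next bucket link),
// and the table is kept at most 50% full, so divide by 3 * 2 = 6.
const rawLimit = maxElements / (3 * 2);           // ~22.4 million
// Capacity must be a power of 2, so round down to the next power of 2.
const maxEntries = Math.pow(2, Math.floor(Math.log2(rawLimit)));
console.log(maxEntries); // 16777216 = 2^24
```

This reproduces exactly the boundary observed in the question: the last successful insertion is at index 16777215, i.e. the 2^24-th entry.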