Just a few minutes back I answered a question asking about the "Maximum possible size of HashMap in Java". As I have always read, HashMap is a growable data structure; its size is limited only by the JVM's memory. Hence I thought there was no hard limit to its size and answered accordingly. (The same applies to HashSet as well.)
But someone corrected me, pointing out that since the size() method of HashMap returns an int, there is a limit on its size. A perfectly correct point. I tried to test it locally but failed: inserting more than 2,147,483,647 entries into a HashMap needs far more than the 8GB of memory I have.
My questions were:
- What happens when we try to insert the (2,147,483,647 + 1)th element into the HashMap/HashSet?
- Is an error thrown?
- If yes, which error? If not, what happens to the HashMap/HashSet, its already existing elements, and the new element?
If someone is blessed with access to a machine with say 16GB memory, you can try it out practically. :)
The underlying capacity of the array has to be a power of two, and it is capped at 2^30. When that capacity is reached the load factor is effectively ignored and the array stops growing.
At that point the rate of collisions increases.
Given that hashCode() only provides 32 bits, it wouldn't make much sense to grow the table much beyond this in any case. The resize() method from the HashMap source shows this:
/**
 * Rehashes the contents of this map into a new array with a
 * larger capacity. This method is called automatically when the
 * number of keys in this map reaches its threshold.
 *
 * If current capacity is MAXIMUM_CAPACITY, this method does not
 * resize the map, but sets threshold to Integer.MAX_VALUE.
 * This has the effect of preventing future calls.
 *
 * @param newCapacity the new capacity, MUST be a power of two;
 *        must be greater than current capacity unless current
 *        capacity is MAXIMUM_CAPACITY (in which case value
 *        is irrelevant).
 */
void resize(int newCapacity) {
    Entry[] oldTable = table;
    int oldCapacity = oldTable.length;
    if (oldCapacity == MAXIMUM_CAPACITY) {
        threshold = Integer.MAX_VALUE;
        return;
    }

    Entry[] newTable = new Entry[newCapacity];
    transfer(newTable);
    table = newTable;
    threshold = (int)(newCapacity * loadFactor);
}
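To see why the table can never exceed 2^30 buckets, here is a minimal stand-alone sketch (a hypothetical demo class, not JDK code) of the power-of-two doubling that resize() performs; the MAXIMUM_CAPACITY value of 1 << 30 and the initial capacity of 16 are taken from HashMap's defaults:

public class CapacityCapDemo {
    // Same cap as java.util.HashMap: the largest power of two an int can hold.
    static final int MAXIMUM_CAPACITY = 1 << 30;

    public static void main(String[] args) {
        int capacity = 16;                     // HashMap's default initial capacity
        while (capacity < MAXIMUM_CAPACITY) {
            capacity <<= 1;                    // resize() always doubles the table
        }
        System.out.println(capacity);          // 1073741824 (2^30)
        System.out.println(capacity << 1);     // -2147483648: doubling once more overflows int
    }
}

Since doubling 2^30 overflows int, there is no larger power-of-two capacity available, which is why resize() just pins the threshold at Integer.MAX_VALUE instead of growing further.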
When the size passes Integer.MAX_VALUE, the counter simply overflows: new entries are still chained into the buckets, but size wraps around to a negative value. You can see this in addEntry():
void addEntry(int hash, K key, V value, int bucketIndex) {
    // Prepend the new entry to the chain in this bucket.
    Entry<K,V> e = table[bucketIndex];
    table[bucketIndex] = new Entry<K,V>(hash, key, value, e);
    // size is a plain int; incrementing it past Integer.MAX_VALUE wraps it to a negative value.
    if (size++ >= threshold)
        resize(2 * table.length);
}