Why do I get an OutOfMemoryError when inserting 50,000 objects into a HashMap?

Posted 2019-03-11 04:28

I am trying to insert about 50,000 objects (and therefore 50,000 keys) into a java.util.HashMap&lt;java.awt.Point, Segment&gt;. However, I keep getting an OutOfMemoryError. (Segment is my own class, and very lightweight: one String field and three int fields.)

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.HashMap.resize(HashMap.java:508)
    at java.util.HashMap.addEntry(HashMap.java:799)
    at java.util.HashMap.put(HashMap.java:431)
    at bus.tools.UpdateMap.putSegment(UpdateMap.java:168)

This seems quite ridiculous since I see that there is plenty of memory available on the machine - both in free RAM and HD space for virtual memory.

Is it possible Java is running with some stringent memory requirements? Can I increase these?

Is there some weird limitation with HashMap? Am I going to have to implement my own? Are there any other classes worth looking at?

(I am running Java 5 under OS X 10.5 on an Intel machine with 2GB RAM.)

10 Answers
看我几分像从前
#2 · 2019-03-11 05:09

You probably need to set the flag -Xmx512m (or some larger value) when starting Java. I think 64 MB is the default.
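
A quick way to confirm what limit the JVM is actually running with is to ask the runtime directly. This is a minimal sketch (the class name `HeapInfo` is just for illustration); run it with and without `-Xmx512m` to see the difference:

```java
public class HeapInfo {
    public static void main(String[] args) {
        // maxMemory() reports the largest heap the JVM will attempt to use,
        // i.e. the effective -Xmx value.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

For example, `java -Xmx512m HeapInfo` should report roughly 512 MB, while running it with no flag shows the platform default.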

Edited to add: After you figure out how much memory your objects are actually using with a profiler, you may want to look into weak or soft references to make sure you're not accidentally holding some of your memory hostage from the garbage collector after you're no longer using it.
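
As a sketch of the soft-reference idea (the `SoftCache` class and its methods are hypothetical names, not anything from the question's code): wrapping values in `SoftReference` lets the garbage collector reclaim them under memory pressure instead of throwing an OutOfMemoryError.

```java
import java.lang.ref.SoftReference;
import java.util.HashMap;
import java.util.Map;

public class SoftCache {
    // Values wrapped in SoftReference may be reclaimed by the GC
    // when the heap runs low, so callers must handle a null result.
    private final Map<String, SoftReference<byte[]>> cache =
            new HashMap<String, SoftReference<byte[]>>();

    public void put(String key, byte[] value) {
        cache.put(key, new SoftReference<byte[]>(value));
    }

    public byte[] get(String key) {
        SoftReference<byte[]> ref = cache.get(key);
        return ref == null ? null : ref.get(); // null if already collected
    }

    public static void main(String[] args) {
        SoftCache c = new SoftCache();
        c.put("a", new byte[] {1, 2, 3});
        // Immediately after insertion the value is normally still reachable.
        System.out.println(c.get("a") != null);
    }
}
```

Note this trades memory safety for determinism: entries can vanish at any time, so it only suits data you can recompute or reload.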

爷的心禁止访问
#3 · 2019-03-11 05:12

Implicit in these answers is that Java has a fixed heap size and doesn't grow beyond the configured maximum. This is unlike, say, C, where memory is constrained only by the machine the program runs on.

做自己的国王
#4 · 2019-03-11 05:18

Another thing to try, if you know the number of objects beforehand, is to use the HashMap(int capacity, float loadFactor) constructor instead of the default no-arg one, which uses defaults of (16, 0.75). If the number of elements in your HashMap ever exceeds capacity * loadFactor, the underlying array in the HashMap is resized to the next power of 2 and the table is rehashed. That array also requires a contiguous area of memory, so for example if you're doubling from a 32768-entry to a 65536-entry array, you'll need a free 256 kB chunk of memory. To avoid the extra allocation and rehashing penalties, just use a larger hash table from the start. It also decreases the chance that you won't have a contiguous area of memory large enough to fit the table.
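
The sizing rule above can be sketched like this (using a String value in place of the asker's Segment class, whose definition we don't have):

```java
import java.awt.Point;
import java.util.HashMap;
import java.util.Map;

public class PresizedMap {
    public static void main(String[] args) {
        int expected = 50000;
        float loadFactor = 0.75f;
        // Choose capacity so that capacity * loadFactor >= expected,
        // meaning all entries fit without ever triggering a resize.
        int capacity = (int) (expected / loadFactor) + 1;

        Map<Point, String> map =
                new HashMap<Point, String>(capacity, loadFactor);
        for (int i = 0; i < expected; i++) {
            map.put(new Point(i, i), "segment-" + i);
        }
        System.out.println(map.size()); // 50000
    }
}
```

HashMap rounds the requested capacity up to the next power of 2 internally, so the exact arithmetic doesn't need to be precise; it just needs to clear the threshold.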

不美不萌又怎样
#5 · 2019-03-11 05:18

You also might want to take a look at this:

http://java.sun.com/docs/hotspot/gc/
