I have this Java simulator that needs to handle a huge amount of data. It works fine, but once I get up to an array of int[100000][100][2] along with other big arrays, the program says that it's out of memory (java.lang.OutOfMemoryError).
All fine and good; I just give it more memory. But it seems to always run out around ~300MB even though I allow it 2GB. This is all from watching Task Manager.
Is there something wrong with my system, or is this just a Java thing that I need to deal with?
@DanielPryden
OS: Win 7 32-bit, 4GB of RAM on board
JVM Command: java -Xmx2048M -Xms2048M Simulator
Error Data: I had to get this from an IDE (IntelliJ); I don't know how to do it from cmd. I assume this is what you are looking for.
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at Simulator.main(Simulator.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
If you're running on a 32-bit Windows OS, it's going to be impossible to allocate a full 2GB. Even if you muck about with Windows internals, the largest usable address space you'll ever get is 3GB, and even then it won't all be contiguous (and the JVM requires contiguous space to build the Java heap). In practice, with the Sun/Oracle JVM, I've never been able to successfully allocate a heap any bigger than about 1.5GB -- and if you're using JNI at all, the maximum possible heap is reduced by any DLLs you link in.
If you really need a big heap, I would first recommend you move to a 64-bit OS if at all possible. Second, as other answers here point out, allocating non-contiguous memory is more likely to succeed: use a LinkedList or another structure that allocates data in separate chunks. There is a bit of a trade-off here; you probably want each chunk to contain an array of at least 64KB.
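A minimal sketch of that idea, assuming a simple get/set access pattern (the class name, chunk size, and use of an ArrayList rather than a LinkedList are my own choices, not prescribed here):

```java
import java.util.ArrayList;
import java.util.List;

/**
 * A large logical int array backed by a list of fixed-size blocks,
 * so no single allocation needs to be huge or contiguous.
 */
class ChunkedIntArray {
    private static final int CHUNK_SIZE = 1 << 16; // 64K ints (256KB) per chunk
    private final List<int[]> chunks = new ArrayList<>();

    ChunkedIntArray(long length) {
        long remaining = length;
        while (remaining > 0) {
            int size = (int) Math.min(CHUNK_SIZE, remaining);
            chunks.add(new int[size]); // many small blocks instead of one giant one
            remaining -= size;
        }
    }

    int get(long index) {
        return chunks.get((int) (index / CHUNK_SIZE))[(int) (index % CHUNK_SIZE)];
    }

    void set(long index, int value) {
        chunks.get((int) (index / CHUNK_SIZE))[(int) (index % CHUNK_SIZE)] = value;
    }
}
```

Each chunk is a modest 256KB allocation, so the JVM only needs to find many small free blocks rather than one enormous one.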
Finally, you might get better results if you can find a way to split up your processing into separate processes -- that is, run multiple Java instances, each with its own set of data to operate on, and use sockets or files to communicate between them. That said, with a 32-bit Windows OS, you still probably won't be able to make use of your 4GB of RAM, let alone any larger amount.
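A rough sketch of the multi-process route (the worker class name, shard count, heap size, and file-based hand-off are all hypothetical, just to show the shape):

```java
/**
 * Parent process: start one worker JVM per data shard, each with its own
 * (smaller) heap, wait for them, then merge their output files.
 */
public class Launcher {
    public static void main(String[] args) throws Exception {
        int shards = 4;
        Process[] workers = new Process[shards];
        for (int i = 0; i < shards; i++) {
            workers[i] = new ProcessBuilder(
                    "java", "-Xmx512M", "SimulatorWorker",
                    String.valueOf(i),          // which slice of the data to process
                    "result-" + i + ".bin")     // where to write the partial result
                    .inheritIO()
                    .start();
        }
        for (Process w : workers) {
            w.waitFor();                        // wait for all shards to finish
        }
        // merge result-0.bin .. result-3.bin here
    }
}
```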
You may be running into heap fragmentation issues. Even if you set -Xmx2G, large allocations like the array above still need contiguous blocks of memory to succeed. I would suggest pinning the minimum heap size as well, for example -Xms2G. This of course requires that your machine actually have well above 2GB available, because of OS overhead, other processes, etc.
Alternatively, you might revisit your data structure to see if you really need such a contiguous block. Breaking it down in some way may reduce the requirement for large contiguous blocks of memory.
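One concrete way to restructure it, sketched below on the assumption that plain indexed access is all you need: in Java, int[100000][100][2] is not a single block at all, but more than ten million separate array objects (one outer, 100,000 middle, and 10,000,000 tiny int[2] leaves), and each leaf pays object-header overhead larger than its 8 bytes of data. That overhead, rather than the 80MB of raw int data, may be why the process dies around the 300MB mark. Flattening to one int[] trades millions of tiny objects for a single larger (~80MB) contiguous allocation:

```java
/**
 * A flat replacement for int[dim1][dim2][dim3] using index arithmetic.
 * Note: dim1 * dim2 * dim3 must fit in an int (it does for 100000*100*2).
 */
class FlatIntArray3D {
    private final int dim2, dim3;
    private final int[] data;

    FlatIntArray3D(int dim1, int dim2, int dim3) {
        this.dim2 = dim2;
        this.dim3 = dim3;
        this.data = new int[dim1 * dim2 * dim3]; // one allocation instead of millions
    }

    int get(int i, int j, int k) {
        return data[(i * dim2 + j) * dim3 + k];
    }

    void set(int i, int j, int k, int value) {
        data[(i * dim2 + j) * dim3 + k] = value;
    }
}
```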
Heap memory is divided into three spaces:
- Old Generation
- Survivor Space
- Eden Space
By default, the virtual machine grows or shrinks the heap at each collection to try to keep the proportion of free space to live objects within a specific range. This target range is set as a percentage by the parameters -XX:MinHeapFreeRatio and -XX:MaxHeapFreeRatio, and the total size is bounded below by -Xms and above by -Xmx.
The default ratio in my JVM (1.6.0_26) is 30/70, so the maximum size of an object in the old generation is limited (with -Xmx1G) to about 700MB.
However, you can size the generations with JVM options. For example, if you run your class with -Xmx1G -XX:NewRatio=10, you can fit a bigger object in memory.
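To see what a given set of flags actually yields, a throwaway check like the following can help (HeapCheck is a hypothetical class name; exact numbers vary by JVM and platform):

```java
/**
 * Print what the JVM actually reserved, then attempt the failing allocation.
 * Run with, e.g.: java -Xmx1G -XX:NewRatio=10 HeapCheck
 */
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
        // try the allocation that was failing; may still throw on constrained heaps
        int[][][] big = new int[100000][100][2];
        System.out.println("allocated " + big.length + " top-level slices");
    }
}
```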
It looks like Java wasn't designed to hold large monolithic objects in memory (like huge arrays). Typical memory usage in an application is a graph of many relatively small objects, and typically you get an OutOfMemoryError only when you run out of room in all of these spaces.
Below are a couple of useful (and interesting to read) articles:
Ergonomics in the 5.0 Java[tm] Virtual Machine
Tuning Garbage Collection with the 5.0 Java[tm] Virtual Machine
EDIT: I can't reproduce it on my box (Mac OS X, 8GB RAM). Taking into account that a single array of this shape takes roughly 200MB+ in memory, I agree with brettw that the issue is caused by the lack of a contiguous block of memory of this size (not by generation sizes). As for a fix: use a collection together with plain arrays, or buy more memory (recommended :-) ).