Can I use more heap than 32 GB with compressed oops?

Posted 2020-02-28 07:43

Question:

I understand that with compressed oops we can only use 32 GB of heap. Is there some way I can use more than that, for example by allocating two heaps or something similar?

Thanks, Vineeth

Answer 1:

You can't have multiple heaps (you can have multiple JVMs though, which is called scaling out as opposed to scaling up).

The JVM uses compressed object pointers automatically for heaps below 32 GiB. If you understand how they work (the lowest three bits of each address are dropped, because they are always 0 thanks to 8-byte object alignment), you'll see why you can't go any further.
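As a rough sketch of the trick (illustration only, not actual HotSpot code):

    // Illustration only: why a 32-bit compressed oop can address a 32 GiB heap
    // when objects are 8-byte aligned (not actual HotSpot code).
    public class CompressedOopSketch {
        static final int SHIFT = 3; // log2 of the 8-byte alignment

        // Decode: real address = heap base + (compressed oop << 3)
        static long decode(long heapBase, int compressedOop) {
            return heapBase + ((compressedOop & 0xFFFFFFFFL) << SHIFT);
        }

        public static void main(String[] args) {
            long maxAddressable = (1L << 32) << SHIFT; // 2^32 slots * 8 bytes
            System.out.println(maxAddressable / (1L << 30) + " GiB"); // 32 GiB
        }
    }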

There is an interesting consequence: once you cross this 32 GiB border, the JVM stops using compressed object pointers, which effectively reduces the available memory. That means if you increase your heap above 32 GiB, you must go well above it. According to the excellent presentation Everything I Ever Learned about JVM Performance Tuning @twitter (around 13:00), increasing the heap from 32 GiB to anything below roughly 48 GiB will actually decrease the amount of effectively available memory (!), because compressed object pointers are gone and every reference doubles from 4 to 8 bytes.
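A back-of-the-envelope calculation of that break-even point (the one-third share of heap taken by references is purely an illustrative assumption; the real figure depends on your object graph):

    // Rough break-even estimate: heap needed without compressed oops to hold
    // the same objects as a full 32 GiB compressed-oop heap.
    // The 1/3 "reference share" below is an illustrative guess, not a measurement.
    public class BreakEvenSketch {
        public static void main(String[] args) {
            double compressedHeapGiB = 32.0;
            double referenceShare = 1.0 / 3.0;                    // assumed fraction of heap that is references
            double refsGiB = compressedHeapGiB * referenceShare;  // ~10.7 GiB of 4-byte references
            // Without compressed oops each reference doubles from 4 to 8 bytes.
            double neededGiB = compressedHeapGiB + refsGiB;       // ~42.7 GiB just to break even
            System.out.printf("Break-even heap without compressed oops: ~%.1f GiB%n", neededGiB);
        }
    }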



Answer 2:

If you need more than 32 GB, I suggest you consider using some off-heap memory. This effectively gives you additional memory space that doesn't use up much heap.

For example, I routinely use 200-800 GB of memory, of which only 1-2 GB is heap. This means I get the most efficient form of compressed oops and virtually unlimited capacity. Note: there are three forms of compressed oops:

  • plain 32-bit unshifted (up to ~2 GB)
  • 32-bit shifted (up to ~26 GB)
  • 32-bit shifted and offset (up to ~32 GB)

Two ways of using off-heap memory are direct ByteBuffers and memory-mapped files. Direct memory scales well up to about 3/4 of your main memory size. Memory-mapped files scale well up to the size of your disk space (which is typically much larger).
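A minimal sketch of both approaches (the file name and sizes here are just examples):

    import java.io.RandomAccessFile;
    import java.nio.ByteBuffer;
    import java.nio.MappedByteBuffer;
    import java.nio.channels.FileChannel;

    public class OffHeapExamples {
        public static void main(String[] args) throws Exception {
            // 1. Direct memory: allocated outside the Java heap (64 MiB here).
            ByteBuffer direct = ByteBuffer.allocateDirect(64 << 20);
            direct.putLong(0, 42L);

            // 2. Memory-mapped file: backed by disk, limited by disk space.
            try (RandomAccessFile file = new RandomAccessFile("data.bin", "rw");
                 FileChannel channel = file.getChannel()) {
                MappedByteBuffer mapped =
                        channel.map(FileChannel.MapMode.READ_WRITE, 0, 64 << 20);
                mapped.putLong(0, 42L);
            }
        }
    }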

Here I have applied many optimizations, such that in the end 80% of the space is eaten up by references rather than the actual data.

That sounds like you are not using the most efficient data structures. You can switch to data structures where the actual data accounts for most of the space used, or at least two-thirds of it.
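A contrived sketch of the idea, assuming your data is mostly numeric (the sizes in the comments are rough estimates):

    // The same 1,000,000 int -> double entries stored two ways.
    public class DataLayoutSketch {
        public static void main(String[] args) {
            int n = 1_000_000;

            // Reference-heavy: every entry needs a HashMap node, a boxed Integer
            // and a boxed Double, plus the references between them
            // (several dozen bytes per entry, mostly overhead).
            java.util.Map<Integer, Double> boxed = new java.util.HashMap<>(n);
            for (int i = 0; i < n; i++) boxed.put(i, i * 0.5);

            // Data-heavy: two parallel primitive arrays, 12 bytes of data per
            // entry and no per-entry objects or references at all.
            int[] keys = new int[n];
            double[] values = new double[n];
            for (int i = 0; i < n; i++) { keys[i] = i; values[i] = i * 0.5; }

            System.out.println(boxed.size() + " entries, " + keys.length + " keys");
        }
    }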



Answer 3:

You can use larger heap sizes with an additional parameter: -XX:ObjectAlignmentInBytes=alignment

This parameter sets the byte alignment of Java objects on the heap. The default value is 8 (bytes). The value must be a power of two, in the range 8 to 256.

The heap size limit in bytes is calculated as:

4GB * ObjectAlignmentInBytes

For example, a 64 GB heap can still use compressed pointers with the following option:

-XX:ObjectAlignmentInBytes=16
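
You can check what you actually got at runtime, for example with a small program using the HotSpot diagnostic MXBean (a sketch; the flag names are HotSpot-specific):

    import java.lang.management.ManagementFactory;
    import com.sun.management.HotSpotDiagnosticMXBean;

    // Run with e.g.: java -XX:ObjectAlignmentInBytes=16 -Xmx64g OopAlignmentCheck
    public class OopAlignmentCheck {
        public static void main(String[] args) {
            HotSpotDiagnosticMXBean hotspot =
                    ManagementFactory.getPlatformMXBean(HotSpotDiagnosticMXBean.class);
            long alignment = Long.parseLong(
                    hotspot.getVMOption("ObjectAlignmentInBytes").getValue());
            boolean compressed = Boolean.parseBoolean(
                    hotspot.getVMOption("UseCompressedOops").getValue());
            // Compressed oops can address 2^32 * alignment bytes.
            long limitGiB = ((1L << 32) * alignment) >> 30;
            System.out.println("ObjectAlignmentInBytes = " + alignment);
            System.out.println("UseCompressedOops      = " + compressed);
            System.out.println("Compressed-oop heap limit ~ " + limitGiB + " GiB");
        }
    }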

There is a note to be considered in the documentation for larger heap sizes though:

Note: As the alignment value increases, the unused space between objects will also increase. As a result, you may not realize any benefits from using compressed pointers with large Java heap sizes.



Answer 4:

If I were in your shoes, I'd investigate each of the following:

  1. Not using compressed oops.
  2. Reducing your application's memory consumption (a memory profiler is an extremely handy tool for investigating memory usage).
  3. Splitting the workload across multiple JVMs, each with a sub-32GB heap.

Each of the above has the potential to solve your problem. Which is the most appropriate is hard for us to say.

80% of space is eaten up by the references rather than the actual data.

This sounds rather extreme. It may be worth revisiting your data structures, with the emphasis on reducing the number of object references. I've done things along these lines in the past, but it's very hard to give specific recommendations without knowing your problem and the data structures you're currently using.



Answer 5:

What exactly is the nature of the data?

The way to do this might be to store the data off the Java heap. You can do this by getting hold of some off-heap memory, typically using a direct ByteBuffer, and then storing data in it in the form of bytes. This has various advantages; the objects can be stored very compactly, you don't need to have a huge heap, and the objects won't be swept by the garbage collector. The disadvantage is the complexity, and the risk of memory leaks.
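A minimal sketch of the idea, packing a hypothetical fixed-size (id, price) record into a direct buffer instead of keeping it as objects on the heap:

    import java.nio.ByteBuffer;

    // Sketch: store fixed-size (long id, double price) records as raw bytes
    // off-heap instead of as objects on the Java heap.
    public class OffHeapRecords {
        private static final int RECORD_SIZE = Long.BYTES + Double.BYTES; // 16 bytes

        public static void main(String[] args) {
            ByteBuffer store = ByteBuffer.allocateDirect(1_000 * RECORD_SIZE);

            // Write record number 7 at its fixed offset.
            int offset = 7 * RECORD_SIZE;
            store.putLong(offset, 123_456_789L);
            store.putDouble(offset + Long.BYTES, 99.95);

            // Read it back later; the garbage collector never scans these bytes.
            long id = store.getLong(offset);
            double price = store.getDouble(offset + Long.BYTES);
            System.out.println(id + " -> " + price);
        }
    }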

There are libraries which can help you do this, including:

  • http://code.google.com/p/vanilla-java/wiki/HugeCollections
  • http://directmemory.apache.org/