JVM deep memory size of an object [duplicate]

Published: 2019-07-31 02:28

Question:

This question already has an answer here:

  • In Java, what is the best way to determine the size of an object? 25 answers

As far as I know, the well-known Java Instrumentation method is unable to correctly calculate the deep size of an object.

Is there a reliable way to compute the correct deep size of an object on the JVM?

The use case I'm thinking about is a fixed (or upper bounded) memory size data structure, i.e. a cache.

Note: as far as possible, I would like an enterprise-ready solution, so either a "standard" coding practice or a well-tested library.

Answer 1:

I know the well-known Java Instrumentation method is unable to correctly calculate the deep size of an object.

With Instrumentation alone, no.

With Instrumentation combined with knowledge of how a particular JVM lays out memory, you can get the number of bytes used. That won't tell you how other JVMs behave, and it doesn't tell you how much of the data is shared with other objects.
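The usual approach, sketched below under assumptions: `Instrumentation.getObjectSize` gives only the shallow size, so you walk the reference graph reflectively and sum shallow sizes, counting each object once. A real `Instrumentation` instance must be obtained from a `premain`/`agentmain` agent (`-javaagent`); here the shallow sizer is abstracted as a function so the traversal itself can be shown standalone. Note that `setAccessible` may fail on JDK-internal classes under the module system (Java 9+), which is one reason this is not fully reliable.

```java
import java.lang.reflect.Array;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.IdentityHashMap;
import java.util.function.ToLongFunction;

public final class DeepSizer {

    // shallowSize would normally be instrumentation::getObjectSize from a
    // -javaagent premain; it is passed in here purely for illustration.
    public static long deepSize(Object root, ToLongFunction<Object> shallowSize) {
        if (root == null) return 0L;
        // Identity-based "seen" set so shared objects are counted only once.
        IdentityHashMap<Object, Boolean> seen = new IdentityHashMap<>();
        Deque<Object> stack = new ArrayDeque<>();
        stack.push(root);
        long total = 0L;
        while (!stack.isEmpty()) {
            Object obj = stack.pop();
            if (seen.put(obj, Boolean.TRUE) != null) continue; // already counted
            total += shallowSize.applyAsLong(obj);
            Class<?> cls = obj.getClass();
            if (cls.isArray()) {
                // Primitive array contents are part of the shallow size;
                // only object arrays contribute further references.
                if (!cls.getComponentType().isPrimitive()) {
                    for (int i = 0; i < Array.getLength(obj); i++) {
                        Object e = Array.get(obj, i);
                        if (e != null) stack.push(e);
                    }
                }
                continue;
            }
            // Walk instance reference fields up the class hierarchy.
            for (Class<?> c = cls; c != null; c = c.getSuperclass()) {
                for (Field f : c.getDeclaredFields()) {
                    if (Modifier.isStatic(f.getModifiers())
                            || f.getType().isPrimitive()) continue;
                    f.setAccessible(true); // may throw on JDK internals (Java 9+)
                    try {
                        Object v = f.get(obj);
                        if (v != null) stack.push(v);
                    } catch (IllegalAccessException ignored) {
                        // Inaccessible field: its referent is simply not counted.
                    }
                }
            }
        }
        return total;
    }

    private DeepSizer() {}
}
```

Even when it runs, the result is an estimate: it depends on the sizer's model of one JVM's layout and silently undercounts anything it cannot reach.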

Is there a reliable way to compute on the JVM the correct deep size of an object?

I use a profiler, but unless you trust at least some of the tools you use, you can never know.

The use case I'm thinking about is a fixed (or upper bounded) memory size data structure, i.e. a cache.

How slow are you willing to make your cache in exchange for precise memory accounting? If it is 10x or 100x slower but reports very accurate usage, is that better than something which just counts the number of elements?

so either a "standard" coding practice or a well tested library

In that case, use the element count. You can use a `LinkedHashMap` (overriding `removeEldestEntry`) or Ehcache for this.
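The `LinkedHashMap` route is a standard idiom: construct it in access order and override `removeEldestEntry` so the least-recently-used entry is evicted once a fixed element count is exceeded. A minimal sketch (the class name and capacity are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Count-bounded LRU cache: once size() exceeds maxEntries, the
// least-recently-accessed entry is evicted automatically on insert.
class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    LruCache(int maxEntries) {
        // accessOrder = true: get() moves an entry to the "most recent" end.
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```

`LinkedHashMap` is not thread-safe, so wrap it with `Collections.synchronizedMap` for concurrent use; for anything beyond a simple bounded map, a dedicated cache library (Ehcache, Caffeine) is the more enterprise-ready choice.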