As far as I know, the well-known Java Instrumentation approach (Instrumentation.getObjectSize) is unable to correctly calculate the deep size of an object.
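For example (a rough sketch, loaded as a -javaagent with a Premain-Class manifest entry; the class name is just illustrative), getObjectSize only reports the shallow size of the object header and its fields, not anything it references:

```java
import java.lang.instrument.Instrumentation;
import java.util.ArrayList;
import java.util.List;

// Loaded with -javaagent; the JVM hands us the Instrumentation instance.
public class ShallowSizeAgent {

    public static void premain(String agentArgs, Instrumentation inst) {
        List<int[]> list = new ArrayList<>();
        list.add(new int[1_000_000]);        // ~4 MB of reachable data

        // Reports only the ArrayList header and its fields (a few dozen bytes),
        // not the backing Object[] or the 4 MB int[] it references.
        System.out.println("shallow size: " + inst.getObjectSize(list));
    }
}
```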
Is there a reliable way to compute the correct deep size of an object on the JVM?
The use case I have in mind is a data structure with a fixed (or upper-bounded) memory size, i.e. a cache.
Note: as far as possible, I would like an enterprise-ready solution, so either a "standard" coding practice or a well-tested library.
With Instrumentation alone, no.
Instrumentation combined with knowledge of how a particular JVM lays out memory will give you the number of bytes used. It won't tell you how other JVMs might behave, and it doesn't tell you how much of that data is shared with other objects.
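As a rough sketch of what that involves (assuming you have already captured an Instrumentation instance via an agent, and ignoring module access restrictions and JVM-specific details), you walk the object graph and sum the shallow size of every distinct reachable object:

```java
import java.lang.instrument.Instrumentation;
import java.lang.reflect.Array;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.ArrayDeque;
import java.util.Collections;
import java.util.Deque;
import java.util.IdentityHashMap;
import java.util.Set;

public final class DeepSize {

    // Sums the shallow size of every distinct object reachable from root.
    // Shared objects are counted once, which is exactly why two "deep sizes"
    // can overlap and the totals become ambiguous.
    public static long of(Object root, Instrumentation inst) {
        if (root == null) {
            return 0L;
        }
        Set<Object> visited = Collections.newSetFromMap(new IdentityHashMap<>());
        Deque<Object> stack = new ArrayDeque<>();
        stack.push(root);
        long total = 0L;

        while (!stack.isEmpty()) {
            Object obj = stack.pop();
            if (!visited.add(obj)) {
                continue;                      // already counted: shared object
            }
            total += inst.getObjectSize(obj);  // shallow size of this one object

            Class<?> type = obj.getClass();
            if (type.isArray()) {
                if (!type.getComponentType().isPrimitive()) {
                    for (int i = 0; i < Array.getLength(obj); i++) {
                        Object element = Array.get(obj, i);
                        if (element != null) {
                            stack.push(element);
                        }
                    }
                }
                continue;
            }
            // Follow every non-static reference field up the class hierarchy.
            for (Class<?> c = type; c != null; c = c.getSuperclass()) {
                for (Field f : c.getDeclaredFields()) {
                    if (f.getType().isPrimitive() || Modifier.isStatic(f.getModifiers())) {
                        continue;
                    }
                    try {
                        f.setAccessible(true);
                        Object value = f.get(obj);
                        if (value != null) {
                            stack.push(value);
                        }
                    } catch (ReflectiveOperationException | RuntimeException e) {
                        // Strongly encapsulated fields (Java 9+ modules) are skipped,
                        // so the result is at best a lower bound there.
                    }
                }
            }
        }
        return total;
    }
}
```

Even then the number only reflects this JVM's layout (headers, alignment, compressed oops), so treat it as an estimate, not a guarantee.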
I use a profiler, but unless you trust at least some of the tools you use, you can never know for certain.
How slow are you willing to make your cache in exchange for precise memory accounting? If it is 10x or 100x slower but reports very accurate usage, is that better than something that just counts the number of elements?
If not, just use the element count. A LinkedHashMap or Ehcache can do that for you.
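For example, a minimal element-count-bounded LRU cache on top of LinkedHashMap (the class name here is just illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A cache bounded by entry count rather than bytes: LinkedHashMap in
// access-order mode evicts the least-recently-used entry once the map
// grows past maxEntries.
public class BoundedCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public BoundedCache(int maxEntries) {
        super(16, 0.75f, true);   // accessOrder = true gives LRU behaviour
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```

If the cached values vary wildly in size, you can bound the count by a rough per-entry estimate instead of chasing exact byte totals.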