Has anybody used the Eclipse Memory Analyzer to detect memory leaks in Java code? Can anybody recommend a good place to look for information on using it? Something I read online suggests that I need to let the program run until it crashes (an OutOfMemoryError occurs), which generates a crash report; I would then open that report in the Memory Analyzer to examine where the leak might be. Is this how everybody uses it?
Perhaps the simplest thing is to run your program under HProf (which ships with the JDK) for some time and then force an exit. The HProf output should give you some immediate pointers regarding your memory leak.
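A minimal sketch of such a run, assuming a JDK 8 or earlier `java` on the PATH; the main class name is illustrative:

```shell
# Record heap allocation sites with the built-in HProf agent; when the
# process exits (or is interrupted), the statistics are written to
# java.hprof.txt in the working directory.
# com.example.MyApp is a placeholder for your own main class.
java -agentlib:hprof=heap=sites,depth=10 com.example.MyApp
```

The `depth` suboption controls how many stack frames are recorded per allocation site, which helps trace leaked objects back to the code that created them.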
Unlike the Eclipse memory debugger (which I know only from what you write), you can gather statistics from any point in the execution.
I generally prefer profiling applications using the NetBeans Profiler. You can fairly easily see which objects are leaking and where they are created in most cases. There are likely several other tools that will do this as well, but I know the NetBeans profiler works well, and is easy to use.
This page explains working with JVM heap dumps. jhat is a simpler, if less graphical, way of working with heap dumps, but you can also load the same dump files into the Eclipse Memory Analyzer. You can also get some information from jvisualvm if you're using a current (1.6) JVM.
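For instance, a sketch of browsing a dump with jhat (bundled with JDK 6 through 8; the dump file name is illustrative):

```shell
# Parse the heap dump and serve an HTML view of it on http://localhost:7000/
jhat -port 7000 dump.hprof
```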
You can try using JProbe. You can monitor your application and inspect objects as they get created. It will also help you analyze which objects don't get garbage collected, which points you toward the leak.
It is not free, but I remember it comes with a trial license, so check for that.
I don't think this is true - you won't get a dump file by default when an OutOfMemoryError occurs (I would bet the author is confusing this problem with some sort of JVM bug that would cause a core dump to be saved). The best procedure is to take a heap dump using jmap; this writes the contents of the heap to a binary file (usually known as an hprof file). This file can then be analyzed by any number of analyzers:
I would highly recommend the Eclipse Memory Analyzer plugin, as it is very quick to load large (> 500 MB) heap dumps (in under a minute), produces useful reports, supports a query language with complex logic, and so on.
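A sketch of producing such a dump from a running process; the pid and output file name are illustrative:

```shell
# Dump only live (reachable) objects to heap.hprof; the "live" suboption
# forces a full GC first, so dead objects don't clutter the analysis.
# 12345 is a placeholder pid - find yours with "jps -l".
jmap -dump:live,format=b,file=heap.hprof 12345
```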
Though -XX:+HeapDumpOnOutOfMemoryError can be useful, my current workflow for using the Eclipse Memory Analyzer (MAT) is:
jmap -dump:format=b,file=dump.hprof <PID>
I usually start working with the histogram and dominator tree views to see if anything seems out of whack, then drill down from there.
VisualVM can be useful, but it seems much less efficient than the Memory Analyzer when working with a heap dump (the Memory Analyzer caches a lot of information when loading the dump). The NetBeans Profiler is nice for finding allocation sites and for time profiling.