import sys
import xml.etree.ElementTree as et

# Clear each element as soon as it has been fully parsed.
for ev, el in et.iterparse(sys.stdin):
    el.clear()
Running the above on the ODP structure RDF dump results in ever-increasing memory use. Why is that? I understand that ElementTree still builds a parse tree, albeit with the child nodes clear()ed. If that is the cause of this memory-usage pattern, is there a way around it?
As mentioned in the answer by Kevin Guerra, the "root.clear()" strategy in the ElementTree documentation only removes fully parsed children of the root. If those children are anchoring huge branches, it's not very helpful.
He touched on the ideal solution, but didn't post any code, so here is an example:
The element of interest will not have subelements; they'll have been removed as soon as their end tags were seen. This might be OK if all you need is the element's text or attributes.
If you want to query into the element's descendants, you need to let a full branch be built for it. For this, maintain a flag, implemented as a depth counter for those elements. Only call .remove() when the depth is zero:
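A sketch of that approach follows. The tag name `record` and the helper `iter_branches` are assumptions for illustration; substitute the element you actually care about:

```python
import io
import xml.etree.ElementTree as ET

def iter_branches(source, tag):
    """Yield fully built elements matching `tag`; prune everything else."""
    depth = 0   # > 0 while inside an element of interest
    stack = []  # open elements; stack[-1] is the current parent
    for event, elem in ET.iterparse(source, events=('start', 'end')):
        if event == 'start':
            if depth or elem.tag == tag:
                depth += 1               # entering (or nested inside) a kept branch
            stack.append(elem)
        else:  # 'end'
            stack.pop()                  # elem itself comes off the stack
            if depth:
                depth -= 1
                if depth == 0:           # the branch of interest is now complete
                    yield elem
                    if stack:
                        stack[-1].remove(elem)  # detach so it can be collected
            elif stack:
                stack[-1].remove(elem)   # uninteresting element: drop immediately
```

Elements outside the branch of interest are removed as soon as they close, while descendants of a matching element are kept until the whole branch has been yielded and detached.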
You are clear()ing each element, but references to them remain in the root document, so the individual elements still cannot be garbage collected. See the discussion in the ElementTree documentation. The solution is to clear the references held by the root, like so:
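A sketch of that pattern, assuming for illustration that the elements of interest are tagged `item`:

```python
import io
import xml.etree.ElementTree as ET

def count_items(source, tag='item'):
    it = ET.iterparse(source, events=('start', 'end'))
    _, root = next(it)      # the first event is the start of the root element
    count = 0
    for event, elem in it:
        if event == 'end' and elem.tag == tag:
            count += 1      # process elem here (its text/attributes are available)
            root.clear()    # drop the references the root holds to finished children
    return count
```

Grabbing the root from the first 'start' event is what makes it possible to call root.clear() inside the loop.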
Another thing to remember about memory usage, which may not be affecting your situation, is that once the VM allocates heap memory from the operating system, it generally never gives that memory back. Most Java VMs work this way too. So you should not expect the size of the interpreter as reported by top or ps to ever decrease, even if that heap memory is unused.

I ran into the same issue. The documentation doesn't make things very clear. The issues in my case were:
1) Calling clear() does release the memory for an element's children. The documentation says it releases all memory, but clear() cannot release the memory of the element it is called on, because that memory belongs to the parent which allocated it.
2) Whether root.clear() helps depends on what root actually is. If root is the parent of the accumulating elements, it works; otherwise it frees nothing.
The fix was to keep a reference to the parent and, once a node is no longer needed, call parent.remove(child_node). This worked and kept the memory profile at a few KB.
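A minimal sketch of that fix, tracking each element's parent with a stack (the tag name `node` and helper `stream_texts` are hypothetical). Note that this simple version removes every element as soon as it closes, so only the element's own text and attributes are still attached when it is processed:

```python
import io
import xml.etree.ElementTree as ET

def stream_texts(source, tag='node'):
    stack = []  # open elements; stack[-1] is the parent of the element just closed
    out = []
    for event, elem in ET.iterparse(source, events=('start', 'end')):
        if event == 'start':
            stack.append(elem)
        else:  # 'end': elem is fully parsed
            stack.pop()
            if elem.tag == tag:
                out.append(elem.text)    # process the node here
            if stack:
                stack[-1].remove(elem)   # parent.remove(child_node) frees the branch
    return out
```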