Memory leaks parsing XML in R

Posted 2020-04-17 05:30

Question:

Memory leaks when using the XML package in R are nothing new. The subject has already been discussed:

  1. Serious Memory Leak When Iteratively Parsing XML Files
  2. http://www.omegahat.org/RSXML/MemoryManagement.html
  3. http://r.789695.n4.nabble.com/memory-leak-using-XML-readHTMLTable-td4643332.html

However, after reading all these documents, I still do not know a solution for my particular case. Consider the following code:

library(XML)

GetHref = function(x) 
{
  # Return the href attribute of the node's <a> child, or NA if there is none
  subDoc = xmlChildren(x)
  hrefs = ifelse(is.null(subDoc$a), NA, xmlGetAttr(subDoc$a, 'href')) 
  rm(subDoc)  
  return(hrefs)
}

url = 'http://www.atpworldtour.com/Share/Event-Draws.aspx?e=338&y=2013'
parse = htmlParse(url)

print(.Call("R_getXMLRefCount", parse)) #prints 1

NodeList = xpathSApply(parse, "//td[@class='col_1']/div/div/div[@class='player']")

print(.Call("R_getXMLRefCount", parse)) #prints 33

PlNames = sapply(NodeList, xmlValue, trim = T)   

print(.Call("R_getXMLRefCount", parse)) #prints 33

hrefs = sapply(NodeList, GetHref)

print(.Call("R_getXMLRefCount", parse)) #prints 157

rm(NodeList) 
gc()

print(.Call("R_getXMLRefCount", parse)) #prints 157

It seems that the internal XML nodes created during post-processing never get freed. What would be a solution in this case?


Session Info:  
R version 3.0.2 (2013-09-25)
Platform: i386-w64-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_United States.1252  LC_CTYPE=English_United States.1252    LC_MONETARY=English_United States.1252
[4] LC_NUMERIC=C                           LC_TIME=English_United States.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] XML_3.98-1.1

loaded via a namespace (and not attached):
[1] tools_3.0.2

Answer 1:

I succeeded in fixing a problem very similar to yours.

My document is a plain XML file:

doc = xmlParse(file_path)

I applied Duncan Temple Lang's advice about bypassing the memory management when collecting subnodes. To do so, I first gather the subnodes with getNodeSet, with the finalizer deactivated:

nodeset = getNodeSet(doc, xml_path, addFinalizer = FALSE)

From this set, I can build a sub-document with xmlDoc and free it explicitly, without any memory leak:

subxml = xmlDoc(nodeset[[1]])
# ... do plenty of sapply over subxml
free(subxml)

Finally, I force the objects to be released, in this order:

free(doc)
rm(nodeset)

With all of this, I no longer have any memory leak. Hope it helps!
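The steps above can be combined into one self-contained sketch. The HTML snippet, node names, and href values below are invented purely for illustration; the pattern (getNodeSet with addFinalizer = FALSE, a temporary xmlDoc per node freed explicitly, then free the main document before dropping the node set) is the one described in this answer:

```r
library(XML)

# Invented sample document standing in for the real page
html = "<html><body>
  <div class='player'><a href='/p/1'>Player One</a></div>
  <div class='player'>No link here</div>
</body></html>"
doc = htmlParse(html, asText = TRUE)

# Step 1: collect the nodes without attaching R finalizers,
# so they do not pin the underlying C-level document
nodeset = getNodeSet(doc, "//div[@class='player']", addFinalizer = FALSE)

# Step 2: wrap each node in a temporary sub-document and free it explicitly
hrefs = sapply(nodeset, function(node) {
  subxml = xmlDoc(node)
  a = getNodeSet(subxml, "//a")
  href = if (length(a) == 0) NA else xmlGetAttr(a[[1]], "href")
  free(subxml)
  href
})
# hrefs should contain "/p/1" for the first node and NA for the second

# Step 3: free the main document first, then drop the node set
free(doc)
rm(nodeset)
gc()
```

The same pattern should apply to the question's document by swapping in its URL and the "//td[@class='col_1']/div/div/div[@class='player']" XPath.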