R's coreNLP::initCoreNLP() throws java.lang.OutOfMemoryError

Posted 2019-07-13 08:19

Question:

coreNLP is an R package for interfacing with Stanford's CoreNLP Java libraries. The first call one must execute (after loading the appropriate packages with the library() command) is initCoreNLP(). Unfortunately, this results in the following error:
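For reference, the sequence that triggers the error is just the standard initialization, nothing exotic:

library(coreNLP)
initCoreNLP()  # fails while loading the NER classifier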

Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... Error in rJava::.jnew("edu.stanford.nlp.pipeline.StanfordCoreNLP", basename(path)) : java.lang.OutOfMemoryError: GC overhead limit exceeded

Note that this is the same problem described here: (initCoreNLP() method call from the Stanford's R coreNLP package throws error). In that case, however, the OP found that rebooting his machine made the problem disappear. That is not the case for me; the error persists even after a reboot.

Has anyone else run into this and can provide a solution or suggestion?

Thanks in advance, DG

CONFIG DETAILS:

R version 3.2.3 (2015-12-10)

rJava version 0.9-7

coreNLP version 0.4-1

Machine: Win 7 with 8GB RAM

Answer 1:

Here is some documentation I found:

https://cran.r-project.org/web/packages/coreNLP/coreNLP.pdf

(specifically page 7)

You can specify how much memory the JVM is given; the relevant signature (from the documentation) is:

initCoreNLP(libLoc, parameterFile, mem = "4g", annotators)

Allocate more memory, and I would expect the problem to go away.
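For example, a minimal call that gives the JVM 4GB of heap (assuming the defaults for the other arguments suit your installation) would be:

library(coreNLP)
initCoreNLP(mem = "4g")  # raise the JVM heap limit before the models are loaded

Note that the JVM can only be initialized once per R session, so restart R before retrying with a larger mem value.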