coreNLP is an R package for interfacing with Stanford's CoreNLP Java libraries. The first call one must make (after loading the appropriate packages with the library() command) is initCoreNLP(). Unfortunately, this results in the following error:
Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... Error in rJava::.jnew("edu.stanford.nlp.pipeline.StanfordCoreNLP", basename(path)) : java.lang.OutOfMemoryError: GC overhead limit exceeded
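For reference, here is a minimal sketch of the session that hits the error (the exact load order shown is an assumption on my part; the outcome is the same regardless):

```r
# minimal reproduction of the failure described above
library(rJava)     # rJava 0.9-7
library(coreNLP)   # coreNLP 0.4-1 from CRAN
initCoreNLP()      # throws java.lang.OutOfMemoryError: GC overhead limit exceeded
```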
Note that this appears to be the same problem described here: (initCoreNLP() method call from Stanford's R coreNLP package throws error). In that case, however, the OP found that rebooting his machine made the problem disappear. That is not the case for me; I keep hitting the error even after a reboot.
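Since the error indicates the JVM is running out of heap, I have been wondering whether giving Java more memory before rJava starts up is the right approach, along these lines (the flag value, and whether initCoreNLP() actually accepts a memory argument, are guesses on my part, not something I have confirmed works):

```r
# untested guess: raise the JVM heap before rJava/coreNLP initialize
options(java.parameters = "-Xmx6g")  # must be set before library(rJava) is called
library(rJava)
library(coreNLP)
initCoreNLP(mem = "6g")  # assuming initCoreNLP() exposes a memory argument
```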
Has anyone else run into this and can provide a solution or suggestion?
Thanks in advance, DG
CONFIG DETAILS:
R version 3.2.3 (2015-12-10)
rJava version 0.9-7
coreNLP version 0.4-1
Machine: Win 7 with 8GB RAM