Using Stanford CoreNLP

asked 2019-03-27 18:21

I am trying to get to grips with Stanford CoreNLP. I used some code from the web to understand what the coreference tool does. I tried running the project in Eclipse but keep encountering an out-of-memory exception. I tried increasing the heap size, but it makes no difference. Any ideas why this keeps happening? Is it a problem specific to my code? Any pointers on using CoreNLP would be awesome.

EDIT - Code Added

import edu.stanford.nlp.dcoref.CorefChain;
import edu.stanford.nlp.dcoref.CorefCoreAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;


import java.util.Iterator;
import java.util.Map;
import java.util.Properties;


public class testmain {

    public static void main(String[] args) {

        String text = "Viki is a smart boy. He knows a lot of things.";
        Annotation document = new Annotation(text);
        Properties props = new Properties();
        props.put("annotators", "tokenize, ssplit, pos, parse, dcoref");
        StanfordCoreNLP pipeline = new StanfordCoreNLP(props);
        pipeline.annotate(document);


        Map<Integer, CorefChain> graph = document.get(CorefCoreAnnotations.CorefChainAnnotation.class);



        Iterator<Integer> itr = graph.keySet().iterator();

        while (itr.hasNext()) {

             // Keep the key as an Integer: graph is a Map<Integer, CorefChain>,
             // so looking it up by the key's String form would return null.
             Integer key = itr.next();

             String value = graph.get(key).toString();

             System.out.println(key + " " + value);      
        }

   }
}
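
As a side note (a minimal sketch, not part of the original question's code; HeapCheck is just an illustrative name): the parse and dcoref models are large, so this pipeline needs far more than the JVM's default heap, and it is worth confirming that whatever -Xmx value is set in Eclipse actually reaches the program. Printing the maximum heap at startup makes that easy to see:

// Sketch: print the maximum heap the JVM was actually started with.
// If this still shows the JVM default, the -Xmx setting from the IDE
// is not being applied to this run configuration.
public class HeapCheck {

    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}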

3 Answers
迷人小祖宗
answered 2019-03-27 18:55

I ran into a similar problem when building a small application using Stanford CoreNLP in Eclipse.
Increasing Eclipse's own heap size will not solve your problem.
After some searching, it turned out to be the Ant build tool's heap size that has to be increased, but I had no idea how to do that.
So I gave up on Eclipse and used Netbeans instead.

PS: You will eventually get an out-of-memory exception with the default settings in Netbeans as well, but it can easily be solved by adjusting the VM memory settings (-Xmx, and optionally -Xms) on a per-application basis.
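
For example (the values are only an illustration, and the exact menu path can vary by Netbeans version), the per-project setting usually lives under Project Properties -> Run -> VM Options and might look like:

    -Xms512m -Xmx3g

-Xmx is the flag that raises the maximum heap and so is the one that actually removes the out-of-memory limit; the coreference models generally want a few gigabytes.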

劫难
answered 2019-03-27 18:58

Fix for Eclipse: you can configure this in the Eclipse preferences as follows:

  1. Window -> Preferences (on Mac it's: Eclipse -> Preferences)
  2. Java -> Installed JREs
  3. Select the JRE and click Edit
  4. In the "Default VM arguments" field, type "-Xmx1024M" (or whatever memory you prefer; 1024 means 1 GB of RAM)
  5. Click Finish or OK.
叼着烟拽天下
answered 2019-03-27 19:13

I think you can define the heap size under VM arguments in right-click -> Run -> Run Configurations. I have tested it on Mac and it works.
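
For reference, the same thing without an IDE (a hedged sketch; the jar names and version numbers are placeholders for whatever CoreNLP distribution is on your classpath, and the ':' separator is for Mac/Linux):

    java -Xmx3g -cp "stanford-corenlp-X.X.X.jar:stanford-corenlp-X.X.X-models.jar:." testmain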
