
Stanford CoreNLP depparse throws OutOfMemoryError (java.lang.OutOfMemoryError: Java heap space)

I am using Stanford CoreNLP (in Java) for some information extraction (using the OpenIE annotator). Please find my code below:

import java.util.Collection;
import java.util.List;
import java.util.Properties;

import edu.stanford.nlp.ie.util.RelationTriple;
import edu.stanford.nlp.io.IOUtils;
import edu.stanford.nlp.ling.CoreAnnotations;
import edu.stanford.nlp.naturalli.NaturalLogicAnnotations;
import edu.stanford.nlp.pipeline.Annotation;
import edu.stanford.nlp.pipeline.StanfordCoreNLP;
import edu.stanford.nlp.util.CoreMap;

public void getInformation(String fileName) {
    // Build a pipeline with the annotators that OpenIE depends on.
    Properties prop = new Properties();
    prop.setProperty("annotators", "tokenize, ssplit, pos, lemma, depparse, natlog, openie");
    StanfordCoreNLP pipeline = new StanfordCoreNLP(prop);

    // Read the whole input file and annotate it.
    Annotation annotation = new Annotation(IOUtils.slurpFileNoExceptions(fileName));
    pipeline.annotate(annotation);
    pipeline.prettyPrint(annotation, System.out);

    System.out.println("=============================");
    System.out.println("The top level annotation");
    System.out.println(annotation.toString());

    List<CoreMap> sentences = annotation.get(CoreAnnotations.SentencesAnnotation.class);
    if (sentences != null && !sentences.isEmpty()) {
        // Only the first sentence is inspected here.
        CoreMap sentence = sentences.get(0);

        Collection<RelationTriple> triples =
            sentence.get(NaturalLogicAnnotations.RelationTriplesAnnotation.class);

        // Print the extracted (confidence, subject, relation, object) triples.
        for (RelationTriple triple : triples) {
            System.out.println(triple.confidence + "\t" +
                triple.subjectLemmaGloss() + "\t" +
                triple.relationLemmaGloss() + "\t" +
                triple.objectLemmaGloss());
        }
    }
}

However, when I run it I get the following error:

INFO edu.stanford.nlp.parser.nndep.DependencyParser - Loading depparse model file: edu/stanford/nlp/models/parser/nndep/english_UD.gz ... 
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space 
at edu.stanford.nlp.parser.nndep.Classifier.preCompute(Classifier.java:661) 
at edu.stanford.nlp.parser.nndep.Classifier.preCompute(Classifier.java:643) 
at edu.stanford.nlp.parser.nndep.DependencyParser.initialize(DependencyParser.java:1168) 
at edu.stanford.nlp.parser.nndep.DependencyParser.loadModelFile(DependencyParser.java:605) 
at edu.stanford.nlp.parser.nndep.DependencyParser.loadFromModelFile(DependencyParser.java:498) 
at edu.stanford.nlp.pipeline.DependencyParseAnnotator.<init>(DependencyParseAnnotator.java:57) 
at edu.stanford.nlp.pipeline.AnnotatorImplementations.dependencies(AnnotatorImplementations.java:273) 
at edu.stanford.nlp.pipeline.AnnotatorFactories$18.create(AnnotatorFactories.java:480) 
at edu.stanford.nlp.simple.Document$5.get(Document.java:154) 
at edu.stanford.nlp.simple.Document$5.get(Document.java:148) 
at edu.stanford.nlp.simple.Document.runDepparse(Document.java:946) 
at edu.stanford.nlp.simple.Document.runNatlog(Document.java:966) 
at edu.stanford.nlp.simple.Document.runOpenie(Document.java:986) 
at edu.stanford.nlp.simple.Sentence.openieTriples(Sentence.java:890) 
at edu.stanford.nlp.simple.Sentence.openieTriples(Sentence.java:900) 
at com.automatics.nlp.OpenIEDemo.main(OpenIEDemo.java:18) 

How can I overcome this exception?

Thanks!

Answer


When you run your program, you need to give it at least 2 GB of RAM, possibly more depending on which other Stanford CoreNLP annotators you are using. Keep adding RAM until the crash goes away.
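For example, you can raise the JVM's maximum heap with the -Xmx flag when launching the program. A minimal sketch, assuming the CoreNLP jars are on the classpath (the jar names below are illustrative; the main class is the one from the stack trace above):

    # give the JVM 4 GB of heap (adjust as needed for your annotators)
    java -Xmx4g -cp stanford-corenlp.jar:stanford-corenlp-models.jar:. com.automatics.nlp.OpenIEDemo

If you run from an IDE, the same flag goes in the run configuration's VM arguments. You can confirm how much heap the JVM actually received by printing Runtime.getRuntime().maxMemory() at startup.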


Thanks! It runs without any exception after increasing the RAM.