java - "insufficient memory for Java Runtime Environment to continue" in an Ubuntu terminal

Reposted. Author: 行者123. Updated: 2023-11-30 06:23:31

I am trying to run

java -Xmx5g -cp stanford-corenlp-3.8.0.jar:stanford-corenlp-models-3.8.0.jar:* edu.stanford.nlp.pipeline.StanfordCoreNLP -annotators tokenize,ssplit,pos,lemma,ner,parse,mention,coref -coref.algorithm neural -file example_file.txt

to find mentions of the same entity in a text. However, when I run the command in the terminal, the process is killed, and an error is written to a log saying there is insufficient memory for the Java Runtime Environment to continue.

I am using Ubuntu, with:

java version "1.8.0_151".

Java(TM) SE Runtime Environment (build 1.8.0_151-b12)

Java Hotspot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)

The log is quite long, so its full details do not fit in the question body.

Here is the log: error log

[Update] I increased the virtual machine's physical memory. Now I get this error:

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:649)
at java.lang.StringBuilder.append(StringBuilder.java:202)
at edu.stanford.nlp.ling.SentenceUtils.listToString(SentenceUtils.java:186)
at edu.stanford.nlp.ling.SentenceUtils.listToString(SentenceUtils.java:169)
at edu.stanford.nlp.ling.SentenceUtils.listToString(SentenceUtils.java:148)
at edu.stanford.nlp.pipeline.ParserAnnotator.doOneSentence(ParserAnnotator.java:360)
at edu.stanford.nlp.pipeline.ParserAnnotator.doOneSentence(ParserAnnotator.java:254)
at edu.stanford.nlp.pipeline.SentenceAnnotator.annotate(SentenceAnnotator.java:102)
at edu.stanford.nlp.pipeline.AnnotationPipeline.annotate(AnnotationPipeline.java:76)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.annotate(StanfordCoreNLP.java:599)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.annotate(StanfordCoreNLP.java:609)
at edu.stanford.nlp.pipeline.StanfordCoreNLP$$Lambda$55/45416784.accept(Unknown Source)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.processFiles(StanfordCoreNLP.java:1172)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.processFiles(StanfordCoreNLP.java:945)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.run(StanfordCoreNLP.java:1274)
at edu.stanford.nlp.pipeline.StanfordCoreNLP.main(StanfordCoreNLP.java:1345)

Is there a way to solve this problem?

Best Answer

The error report says the following:

# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 335785984 bytes for committing reserved memory.
# Possible reasons:
# The system is out of physical RAM or swap space
# In 32 bit mode, the process size limit was hit

At face value:

  • The first explanation means the operating system refused the JVM's request to allocate a large block of native memory because the resource (physical RAM or swap space) was not available.

  • You are using a 64-bit JVM, so the second possible explanation does not apply.

The first explanation is plausible. Possible fixes are:

  • Add more physical memory; e.g. get a bigger machine or VM
  • Add more swap space
  • Reduce the maximum heap size specified via the -Xmx option
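The last two fixes can be sanity-checked from the shell before re-running the pipeline. The sketch below assumes a Linux system with /proc/meminfo; the 75% fraction is an illustrative heuristic of my own, not a CoreNLP recommendation. It reads how much RAM and swap the OS can actually supply and prints a more conservative -Xmx to try:

```shell
#!/bin/sh
# Read available RAM and configured swap from /proc/meminfo (values are in KiB).
avail_kb=$(awk '/^MemAvailable:/ {print $2}' /proc/meminfo)
swap_kb=$(awk '/^SwapTotal:/ {print $2}' /proc/meminfo)

# Illustrative heuristic: cap the Java heap at ~75% of currently available RAM,
# leaving the rest for the JVM's own native allocations and the OS.
heap_mb=$(( avail_kb * 3 / 4 / 1024 ))

echo "available RAM: $(( avail_kb / 1024 )) MiB, swap: $(( swap_kb / 1024 )) MiB"
echo "try: java -Xmx${heap_mb}m -cp ... edu.stanford.nlp.pipeline.StanfordCoreNLP ..."
```

If the printed heap is far below the 5 GiB requested with -Xmx5g, either enlarge the machine/VM or add swap space (for example with mkswap and swapon on a dedicated swap file) before running the command again.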

Regarding java - "insufficient memory for Java Runtime Environment to continue" in an Ubuntu terminal, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/47671697/
