
apache-spark - Spark - UbuntuVM - Insufficient memory for the Java Runtime Environment


I am trying to install Spark 1.5.1 on an Ubuntu 14.04 VM. After unpacking the archive, I changed into the extracted directory and ran the command "./bin/pyspark", which should start the pyspark shell. Instead, I get an error message like the following:

OpenJDK 64-Bit Server VM warning: INFO: os::commit_memory(0x00000000c5550000, 715849728, 0) failed; error='Cannot allocate memory' (errno=12)
There is insufficient memory for the Java Runtime Environment to continue.
Native memory allocation (malloc) failed to allocate 715849728 bytes for committing reserved memory.
An error report file with more information is saved as:
/home/datascience/spark-1.5.1-bin-hadoop2.6/hs_err_pid2750.log

Can someone give me some guidance on how to resolve this?
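
For reference, the steps that produce the error amount to something like the following (a reconstruction based on the description above; the archive name is inferred from the log path):

tar xzf spark-1.5.1-bin-hadoop2.6.tgz
cd spark-1.5.1-bin-hadoop2.6
./bin/pyspark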

Best Answer

We need to set the memory in the conf/spark-defaults.conf file to a value specific to your machine; since the allocation fails while the driver JVM is starting, spark.driver.memory is the relevant setting here (spark.executor.memory can be lowered the same way). For example,

usr1@host:~/spark-1.6.1$ cp conf/spark-defaults.conf.template conf/spark-defaults.conf
nano conf/spark-defaults.conf
spark.driver.memory 512m
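
The same limit can also be applied for a single launch, without editing the config file, via the --driver-memory option that bin/pyspark forwards to spark-submit (the 512m value is just an example; size it to the VM's available RAM):

usr1@host:~/spark-1.6.1$ ./bin/pyspark --driver-memory 512m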

For more information, refer to the official documentation: http://spark.apache.org/docs/latest/configuration.html
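
As a sanity check (not part of the original answer), confirm how much memory the VM actually has free; the failed allocation above was 715849728 bytes (roughly 683 MB), so the chosen driver heap plus JVM overhead must fit within what the free command reports as available:

usr1@host:~/spark-1.6.1$ free -m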

Regarding apache-spark - Spark - UbuntuVM - insufficient memory for the Java Runtime Environment, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/33245529/
