shell - Exception in thread "delete Spark local dirs" java.lang.NullPointerException

Reposted · Author: 行者123 · Updated: 2023-12-02 21:35:52

Hi, I am running a SparkR program via a shell script. When I point the input file at the local filesystem it works fine, but when I point it at HDFS, it throws the following error:

Exception in thread "delete Spark local dirs" java.lang.NullPointerException
at org.apache.spark.storage.DiskBlockManager.org$apache$spark$storage$DiskBlockManager$$doStop(DiskBlockManager.scala:161)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply$mcV$sp(DiskBlockManager.scala:141)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
at org.apache.spark.storage.DiskBlockManager$$anon$1$$anonfun$run$1.apply(DiskBlockManager.scala:139)
at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1617)
at org.apache.spark.storage.DiskBlockManager$$anon$1.run(DiskBlockManager.scala:139)

Any help would be greatly appreciated.

Best Answer

I ran into the same problem in a Scala script. The issue was with the master URL, so I removed the call that sets it.

Before:

val conf = new org.apache.spark.SparkConf().setMaster(masterURL).set("spark.ui.port",port).setAppName("TestScalaApp")

Fixed code:

val conf = new org.apache.spark.SparkConf().setAppName("TestScalaApp")
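With `setMaster` removed from the code, the master URL can instead be supplied at launch time, which keeps it out of the compiled application. A minimal sketch of such an invocation, assuming the job is launched with `spark-submit` (the master value, script name, and input path below are placeholders, not from the original question):

```shell
# Placeholder invocation: adjust --master, the script name, and the HDFS path to your cluster.
spark-submit \
  --master yarn \
  --conf spark.ui.port=4041 \
  my_spark_app.R \
  hdfs:///path/to/input
```

A master URL hard-coded via `SparkConf.setMaster` takes precedence over the one passed to `spark-submit`, so removing it from the code lets the shell script decide where the job runs.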

Regarding shell - Exception in thread "delete Spark local dirs" java.lang.NullPointerException, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/32349951/
