
hadoop - Apache Spark: Error on JavaSparkContext.stop()


When my Spark program calls JavaSparkContext.stop(), I get the following error:

14/12/11 16:24:19 INFO Main: sc.stop {
14/12/11 16:24:20 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster02,38918) not found
14/12/11 16:24:20 ERROR SendingConnection: Exception while reading SendingConnection to ConnectionManagerId(cluster04,59659)
java.nio.channels.ClosedChannelException
at sun.nio.ch.SocketChannelImpl.ensureReadOpen(SocketChannelImpl.java:252)
at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:295)
at org.apache.spark.network.SendingConnection.read(Connection.scala:390)
at org.apache.spark.network.ConnectionManager$$anon$6.run(ConnectionManager.scala:205)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
14/12/11 16:24:20 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster03,59821) not found
14/12/11 16:24:20 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster02,38918) not found
14/12/11 16:24:20 WARN ConnectionManager: All connections not cleaned up
14/12/11 16:24:20 INFO Main: sc.stop }

How can I fix this problem?

The configuration is as follows (a minimal driver-side sketch follows the list):

  • Spark version is 1.1.1
  • The client runs on Windows 7
  • The cluster runs Linux (CentOS 6.5)
  • spark.master=yarn-client
  • Because Spark had problems submitting jobs from Windows to Linux, I applied a patch to the Spark source code (see https://github.com/apache/spark/pull/899).
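
For reference, a minimal sketch of the driver-side setup this configuration implies, using Spark 1.x's Java API; the class name, application name, and job body are placeholders, not the asker's actual code:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class Main {
    public static void main(String[] args) {
        // Matches the reported configuration: spark.master=yarn-client.
        SparkConf conf = new SparkConf()
                .setMaster("yarn-client")
                .setAppName("MyApp");   // placeholder application name
        JavaSparkContext sc = new JavaSparkContext(conf);

        // ... job logic runs here ...

        sc.stop();  // the call after which the ConnectionManager errors appear
    }
}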

Update

When the Spark client runs on Linux, the following error occurs. (I think it is essentially the same error.)

14/12/12 11:32:02 INFO Main: sc.stop {
14/12/12 11:32:02 INFO SparkUI: Stopped Spark web UI at http://clientmachine:4040
14/12/12 11:32:02 INFO DAGScheduler: Stopping DAGScheduler
14/12/12 11:32:02 INFO YarnClientSchedulerBackend: Shutting down all executors
14/12/12 11:32:02 INFO YarnClientSchedulerBackend: Asking each executor to shut down
14/12/12 11:32:02 INFO YarnClientSchedulerBackend: Stopped
14/12/12 11:32:03 INFO ConnectionManager: Removing SendingConnection to ConnectionManagerId(cluster04,52869)
14/12/12 11:32:03 INFO ConnectionManager: Removing ReceivingConnection to ConnectionManagerId(cluster04,52869)
14/12/12 11:32:03 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster04,52869) not found
14/12/12 11:32:03 INFO ConnectionManager: Removing SendingConnection to ConnectionManagerId(cluster03,57334)
14/12/12 11:32:03 INFO ConnectionManager: Removing ReceivingConnection to ConnectionManagerId(cluster03,57334)
14/12/12 11:32:03 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster03,57334) not found
14/12/12 11:32:03 INFO ConnectionManager: Removing SendingConnection to ConnectionManagerId(cluster02,54205)
14/12/12 11:32:03 INFO ConnectionManager: Removing ReceivingConnection to ConnectionManagerId(cluster02,54205)
14/12/12 11:32:03 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster02,54205) not found
14/12/12 11:32:03 INFO MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!
14/12/12 11:32:03 INFO ConnectionManager: Selector thread was interrupted!
14/12/12 11:32:03 INFO ConnectionManager: Removing ReceivingConnection to ConnectionManagerId(cluster02,54205)
14/12/12 11:32:03 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster02,54205) not found
14/12/12 11:32:03 INFO ConnectionManager: Removing ReceivingConnection to ConnectionManagerId(cluster04,52869)
14/12/12 11:32:03 ERROR ConnectionManager: Corresponding SendingConnection to ConnectionManagerId(cluster04,52869) not found
14/12/12 11:32:03 WARN ConnectionManager: All connections not cleaned up
14/12/12 11:32:03 INFO ConnectionManager: ConnectionManager stopped
14/12/12 11:32:03 INFO MemoryStore: MemoryStore cleared
14/12/12 11:32:03 INFO BlockManager: BlockManager stopped
14/12/12 11:32:03 INFO BlockManagerMaster: BlockManagerMaster stopped
14/12/12 11:32:03 INFO SparkContext: Successfully stopped SparkContext
14/12/12 11:32:03 INFO Main: sc.stop }

Best Answer

Some threads suggest putting a Thread.sleep before calling the Spark context's stop(). See if that helps.
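
A minimal sketch of that suggestion; the 5-second delay is an arbitrary, hypothetical value, and this is a workaround rather than a guaranteed fix:

// Let in-flight executor connections drain before shutting down
// (hypothetical delay; tune or remove as needed).
try {
    Thread.sleep(5000);
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
}
sc.stop();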

Regarding hadoop - Apache Spark: Error on JavaSparkContext.stop(), we found a similar question on Stack Overflow: https://stackoverflow.com/questions/27417573/
