
scala - Exception: ERROR SparkContext - Error initializing local SparkContext. java.net.BindException


I am trying to write a test for a Spark application, but I get an exception when I try to run the following test:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.junit.Test

    class BasicIT {

      // The SparkContext is created when the test class is instantiated,
      // which is where the bind failure below occurs.
      val sparkConf: SparkConf = new SparkConf().setAppName("basic.phase.it").setMaster("local[1]")
      var context: SparkContext = new SparkContext(sparkConf)

      @Test
      def myTest(): Unit = {
        print("test")
      }
    }

It fails with this exception:

2016-07-24 21:04:39,956 [main,95] ERROR SparkContext - Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)

java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!

I am currently using IntelliJ on OS X Yosemite.

What am I doing wrong? This same code used to work.

Best Answer

Try adding export SPARK_LOCAL_IP="127.0.0.1" to load-spark-env.sh, or set SPARK_LOCAL_IP="127.0.0.1" before running your Spark application. It worked for me.
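A programmatic alternative, if editing the environment is inconvenient: pin the driver to the loopback address from inside the test itself. The following is a minimal sketch, assuming a Spark version that honors the spark.driver.host configuration property (which serves the same purpose here as SPARK_LOCAL_IP); the class and test names mirror the question's code.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.junit.Test

    class BasicIT {

      // Binding the driver to localhost sidesteps the "Can't assign requested
      // address" failure that occurs when the machine's hostname does not
      // resolve to a reachable IP (a common cause of this BindException on OS X).
      val sparkConf: SparkConf = new SparkConf()
        .setAppName("basic.phase.it")
        .setMaster("local[1]")
        .set("spark.driver.host", "localhost")

      var context: SparkContext = new SparkContext(sparkConf)

      @Test
      def myTest(): Unit = {
        print("test")
      }
    }

Setting SPARK_LOCAL_IP as an environment variable in the IntelliJ run configuration achieves the same effect without touching the code.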

Regarding scala - Exception: ERROR SparkContext - Error initializing local SparkContext. java.net.BindException, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/38555288/
