
amazon-web-services - Spark cannot bind on port 7077 using the public IP


I have installed Spark on AWS. The EC2 instances themselves work, but Spark does not start. When I check the Spark master log, I see the following:

Spark Command: /usr/lib/jvm/java-8-oracle/jre/bin/java -cp /home/ubuntu/spark/conf/:/home/ubuntu/spark/jars/* -Xmx1g org.apache.spark$
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/12 09:40:18 INFO Master: Started daemon with process name: 5451@server1
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for TERM
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for HUP
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for INT
16/09/12 09:40:18 WARN MasterArguments: SPARK_MASTER_IP is deprecated, please use SPARK_MASTER_HOST
16/09/12 09:40:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where a$
16/09/12 09:40:19 INFO SecurityManager: Changing view acls to: ubuntu
16/09/12 09:40:19 INFO SecurityManager: Changing modify acls to: ubuntu
16/09/12 09:40:19 INFO SecurityManager: Changing view acls groups to:
16/09/12 09:40:19 INFO SecurityManager: Changing modify acls groups to:
16/09/12 09:40:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set$
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7077. Attempting port 7078.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7078. Attempting port 7079.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7079. Attempting port 7080.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7080. Attempting port 7081.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7081. Attempting port 7082.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7082. Attempting port 7083.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7083. Attempting port 7084.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7084. Attempting port 7085.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7085. Attempting port 7086.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7086. Attempting port 7087.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7087. Attempting port 7088.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7088. Attempting port 7089.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7089. Attempting port 7090.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7090. Attempting port 7091.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7091. Attempting port 7092.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7092. Attempting port 7093.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkMaster' failed after 16 retries! Co$
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)

My /etc/hosts is the following:
127.0.0.1  localhost

52.211.60.97 server1
52.210.246.199 client1
52.211.71.126 client2
52.211.20.213 client3

# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters

This is my spark-env.sh:
export SPARK_WORKER_MEMORY=512m
export SPARK_EXECUTOR_MEMORY=512m
export SPARK_WORKER_INSTANCES=1
export SPARK_WORKER_CORES=1
export SPARK_WORKER_DIR=/home/ubuntu/spark
export SPARK_LOCAL_IP=52.211.60.97
export SPARK_MASTER_IP=52.211.60.97
export SPARK_MASTER_WEBUI_PORT=4041

I have tried the same setup inside an AWS VPC with private instances and a VPN, and there it works fine. So I suspect the problem is with the public IP. Does Amazon perhaps block some ports on the public IP, or what else could be wrong?
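
For reference, a quick way to check whether the public address is even assignable on the instance (a minimal diagnostic sketch, assuming the addresses from this post): on EC2 the public IP is provided by NAT and normally does not appear on any local interface.

# Show the addresses that are actually configured on the instance's interfaces.
# On EC2 you will usually only see the private address (e.g. 172.31.x.x);
# the public IP 52.211.60.97 is provided by NAT and will not be listed.
ip addr show
hostname -I

# Trying to bind the public IP directly reproduces the same failure Spark hits:
# "Cannot assign requested address".
python3 -c 'import socket; socket.socket().bind(("52.211.60.97", 7077))'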

Best Answer

I faced a similar problem.
It happens because the Spark master cannot bind to the address given in SPARK_MASTER_IP: on EC2 the public IP is provided by NAT and is not assigned to any local network interface, so the bind fails with "Cannot assign requested address".
First, find your hostname with the hostname command.
Then make sure that the machine's own IP address is mapped to that hostname in /etc/hosts.
After that, use that hostname for SPARK_MASTER_IP (a sketch of these steps follows below).
For this problem you can also set export SPARK_LOCAL_IP=127.0.0.1 in cluster mode.
PS: I know this reply is late, but it may help others who land here.
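
A minimal sketch of the steps above on the master node, assuming Spark is installed under /home/ubuntu/spark as in the question and using 172.31.0.10 as a placeholder for the instance's private address (substitute the value reported by hostname -I):

# 1. Find the machine's hostname.
hostname                                   # e.g. server1

# 2. Map the instance's private IP (the one actually assigned to the network
#    interface, not the public/Elastic IP) to that hostname in /etc/hosts.
echo "172.31.0.10 server1" | sudo tee -a /etc/hosts

# 3. In conf/spark-env.sh, point the master at the hostname (or the private IP)
#    instead of the public IP. SPARK_MASTER_IP is deprecated in favor of
#    SPARK_MASTER_HOST, as the log above already warns.
export SPARK_MASTER_HOST=server1
export SPARK_LOCAL_IP=172.31.0.10          # or 127.0.0.1, as suggested above

# 4. Restart the master; it should now bind to port 7077.
/home/ubuntu/spark/sbin/stop-master.sh
/home/ubuntu/spark/sbin/start-master.sh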

A similar question about "amazon-web-services - Spark cannot bind on port 7077 using the public IP" can be found on Stack Overflow: https://stackoverflow.com/questions/39447593/
