
java - Setting up Hadoop YARN (single node) on Ubuntu


I am setting up Hadoop YARN (2.5.1) as a single-node cluster on Ubuntu 13. When I run start-dfs.sh, it produces the output below and the processes do not start (I confirmed this with the jps and ps commands). My bashrc settings are also copied below. Any ideas on what I need to reconfigure?

Additions to bashrc:

export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export HADOOP_INSTALL=/opt/hadoop/hadoop-2.5.1
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
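
To confirm that these additions actually take effect in a new shell, a quick sanity check (a sketch, assuming the paths above):

# Reload the profile and confirm the Hadoop binaries resolve.
source ~/.bashrc
echo "$HADOOP_INSTALL"
which hdfs
hdfs version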

Output of start-dfs.sh:

14/09/22 12:24:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /opt/hadoop/hadoop-2.5.1/logs/hadoop-hduser-namenode-zkserver1.fidelus.com.out
localhost: nice: $HADOOP_INSTALL/bin/hdfs: No such file or directory
localhost: starting datanode, logging to /opt/hadoop/hadoop-2.5.1/logs/hadoop-hduser-datanode-zkserver1.fidelus.com.out
localhost: nice: $HADOOP_INSTALL/bin/hdfs: No such file or directory
Starting secondary namenodes [0.0.0.0]
The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.
ECDSA key fingerprint is cf:e1:ea:86:a4:0c:cd:ec:9d:b9:bc:90:9d:2b:db:d5.
Are you sure you want to continue connecting (yes/no)? yes
0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.
0.0.0.0: starting secondarynamenode, logging to /opt/hadoop/hadoop-2.5.1/logs/hadoop-hduser-secondarynamenode-zkserver1.fidelus.com.out
0.0.0.0: nice: $HADOOP_INSTALL/bin/hdfs: No such file or directory
14/09/22 12:24:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

The bin directory does contain the hdfs file, and it is owned by hduser (I am running the process as hduser). My $HADOOP_INSTALL setting points to the Hadoop directory (/opt/hadoop/hadoop-2.5.1). Should I change permissions or configuration, or simply move the directory out of /opt and into /usr/local?
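
Note that the start-dfs.sh output above contains the literal, unexpanded string $HADOOP_INSTALL, which suggests the variable is not set in the non-interactive shell that start-dfs.sh opens over ssh when it launches the daemons. A diagnostic sketch, assuming the paths above:

# What the ssh'd daemon scripts see (start-dfs.sh starts each daemon via ssh, even on localhost):
ssh localhost 'echo "HADOOP_INSTALL=$HADOOP_INSTALL"; ls -l "$HADOOP_INSTALL/bin/hdfs"'

# What the current interactive shell sees, for comparison:
echo "HADOOP_INSTALL=$HADOOP_INSTALL"
ls -l /opt/hadoop/hadoop-2.5.1/bin/hdfs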

Update: When I run start-yarn.sh, I get the following message:

localhost: Error: Could not find or load main class org.apache.hadoop.yarn.server.nodemanager.NodeManager

Update: I moved the directory to /usr/local, but I get the same warning message.

Update: According to the jps command, the ResourceManager is running. However, when I try to start YARN, it fails with the error given above. I can access the ResourceManager web UI on port 8088. Any ideas?
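
For the NodeManager class-not-found error, checking the YARN classpath may help narrow things down. A sketch, assuming a standard Hadoop 2.5.1 layout under the new /usr/local location (adjust the path to wherever the install actually lives):

# Show the classpath the yarn command builds, and confirm the NodeManager jar is present.
yarn classpath
ls /usr/local/hadoop-2.5.1/share/hadoop/yarn/hadoop-yarn-server-nodemanager-*.jar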

Best Answer

Instead of using start-dfs.sh, try starting the namenode (and the other daemons) with the commands below and see whether that works.

hadoop-daemon.sh start namenode
hadoop-daemon.sh start secondarynamenode
hadoop-daemon.sh start datanode
yarn-daemon.sh start nodemanager
mr-jobhistory-daemon.sh start historyserver
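
After starting the daemons this way, a quick way to confirm they are actually up (a minimal sketch; the exact process list depends on which daemons were started, and the log path is taken from the output above):

# List the running Hadoop JVMs; expect NameNode, SecondaryNameNode, DataNode,
# NodeManager, JobHistoryServer plus the already-running ResourceManager.
jps

# If a daemon is missing, inspect its log, e.g. for the namenode:
tail -n 50 /opt/hadoop/hadoop-2.5.1/logs/hadoop-hduser-namenode-*.log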

Regarding java - Setting up Hadoop YARN (single node) on Ubuntu, a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/25978863/
