
hadoop - Error: Could not find or load main class org.apache.hadoop.hdfs.server.datanode.DataNode


I have Hadoop 2.7.1 and it runs successfully. Next, I downloaded apache-hive-2.1.1-bin and edited the ".bashrc" file to update my user's environment variables. Now, when I start Hadoop with the command "*/sbin/start-dfs.sh", I get the error: "Could not find or load main class org.apache.hadoop.hdfs.server.datanode.DataNode"

Air-di-Danilo:2.7.1 danilogrifoni$ */sbin/start-dfs.sh
18/04/28 12:33:36 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/Cellar/hadoop/2.7.1/libexec/logs/hadoop-danilogrifoni-namenode-Air-di-Danilo.out
localhost: Errore: impossibile trovare o caricare la classe principale org.apache.hadoop.hdfs.server.namenode.NameNode
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.7.1/libexec/logs/hadoop-danilogrifoni-datanode-Air-di-Danilo.out
localhost: Errore: impossibile trovare o caricare la classe principale org.apache.hadoop.hdfs.server.datanode.DataNode
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/Cellar/hadoop/2.7.1/libexec/logs/hadoop-danilogrifoni-secondarynamenode-Air-di-Danilo.out
0.0.0.0: Errore: impossibile trovare o caricare la classe principale org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode
18/04/28 12:34:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Here is the .bashrc file I edited:
export HADOOP_HOME=/usr/local/Cellar/hadoop/2.7.1
export HADOOP_PREFIX=/usr/local/Cellar/hadoop/2.7.1/libexec
export HADOOP_CONF_DIR=/usr/local/Cellar/hadoop/2.7.1/libexec/etc/hadoop
export HADOOP_MAPRED_HOME=/usr/local/Cellar/hadoop/2.7.1
export HADOOP_COMMON_HOME=/usr/local/Cellar/hadoop/2.7.1
export HADOOP_HDFS_HOME=/usr/local/Cellar/hadoop/2.7.1
export YARN_HOME=/usr/local/Cellar/hadoop/2.7.1
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export HADOOP_CLASSPATH=$(hadoop classpath):$HADOOP_CLASSPATH



export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_161.jdk/Contents/Home
export PATH=$PATH:/Library/Java/JavaVirtualMachines/jdk1.8.0_161.jdk/Contents/Home/bin



export PATH=$PATH:/usr/local/Cellar/hadoop/2.7.1/bin
export HADOOP_PID_DIR=/usr/local/Cellar/hadoop/2.7.1

# Set HIVE_HOME

export HIVE_HOME=/Users/danilogrifoni/Documents/apache-hive-2.1.1-bin
export PATH=$PATH:/Users/danilogrifoni/Documents/apache-hive-2.1.1-bin/bin
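
As a sanity check, here is a minimal sketch (assuming the Homebrew layout used above; the exact jar and script locations may differ on other installs) for confirming that the directories these variables point to actually contain the HDFS jars and configuration before starting the daemons:

# The DataNode/NameNode classes ship in the hadoop-hdfs jar under the libexec tree
ls /usr/local/Cellar/hadoop/2.7.1/libexec/share/hadoop/hdfs/hadoop-hdfs-*.jar

# start-dfs.sh reads hadoop-env.sh from HADOOP_CONF_DIR
ls /usr/local/Cellar/hadoop/2.7.1/libexec/etc/hadoop/hadoop-env.sh

# Should print 2.7.1 if the bin directory on PATH resolves correctly
hdfs version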

Best answer

Did you run the source .bashrc command after editing the .bashrc file?
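
For reference, a minimal sketch of applying the change and verifying that the Hadoop classes are visible afterwards (assuming the Homebrew paths shown in the question; adjust the sbin path if your install differs):

# Re-read the updated environment in the current shell
source ~/.bashrc

# Confirm the variables resolved as expected
echo $HADOOP_HOME
echo $HADOOP_CONF_DIR

# 'hadoop classpath' prints the classpath the daemon scripts will use;
# the hadoop-hdfs entries (which contain DataNode/NameNode) should appear in it
hadoop classpath | tr ':' '\n' | grep hdfs

# If the classpath looks right, try starting HDFS again
/usr/local/Cellar/hadoop/2.7.1/libexec/sbin/start-dfs.sh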

Regarding hadoop - Error: Could not find or load main class org.apache.hadoop.hdfs.server.datanode.DataNode, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/50075815/
