
java - Hadoop with OpenJDK: start-dfs.sh (SSH?) error

Reposted · Author: 行者123 · Updated: 2023-12-02 20:37:58

I ran into a problem while setting up a 4-machine Hadoop cluster architecture following this tutorial. I have the following 4 (virtual) machines:

  • node-master
  • node1
  • node2
  • node3

I set up all the conf files on the master node and copied them to the other machines with scp. The master node can reach the slave nodes over ssh. I set JAVA_HOME in .bashrc on all machines. However, this is what I get:
    hadoop@master-node:~$ start-dfs.sh
    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
    WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release
    Starting namenodes on [node-master]
    node-master: ssh: connect to host node-master port 22: Connection timed out
    node1: Error: JAVA_HOME is not set and could not be found.
    node2: Error: JAVA_HOME is not set and could not be found.
    node3: Error: JAVA_HOME is not set and could not be found.
    Starting secondary namenodes [0.0.0.0]
    hadoop@0.0.0.0's password:
    0.0.0.0: Error: JAVA_HOME is not set and could not be found.
    WARNING: An illegal reflective access operation has occurred
    WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/hadoop/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.4.jar) to method sun.security.krb5.Config.getInstance()
    WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
    WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
    WARNING: All illegal access operations will be denied in a future release

    [3 possibilities] Although I'm not really sure this is what's causing the mess, there seems to be a problem with using OpenJDK 11. The error suggests something is wrong with ssh, but i) I pushed my conf files over without any problem, and ii) I can reach all the nodes from the master node. Could it have to do with the way the JAVA_HOME path is set? Here is the end of my .bashrc:
    export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
    export PATH=PATH:$PATH/bin

    Thanks in advance for any clue (I have used very little Java and feel somewhat lost here).

    [edit] Same thing with Oracle JDK 8:
    hadoop@master-node:~$  readlink -f /usr/bin/java
    /usr/lib/jvm/java-8-oracle/jre/bin/java
    hadoop@master-node:~$ export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre
    hadoop@master-node:~$ start-dfs.sh
    Starting namenodes on [node-master]
    node-master: ssh: connect to host node-master port 22: Connection timed out
    node1: Error: JAVA_HOME is not set and could not be found.
    node3: Error: JAVA_HOME is not set and could not be found.
    node2: Error: JAVA_HOME is not set and could not be found.
    Starting secondary namenodes [0.0.0.0]
    hadoop@0.0.0.0's password:

    0.0.0.0: Error: JAVA_HOME is not set and could not be found.

    Best Answer

    Could you export the paths like this instead:

    export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
    export PATH=$PATH:$JAVA_HOME/bin
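
As an aside (not part of the original answer): in the question's line, export PATH=PATH:$PATH/bin, the first PATH is a literal string and /bin is glued onto the end of the old value, so the JDK's bin directory never reaches the search path. A minimal sketch of how the two versions expand, using the question's OpenJDK 11 path and a stand-in old PATH:

```shell
JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
OLD_PATH=/usr/bin                      # stand-in for the pre-existing PATH

# Question's version: "PATH" is a literal string and "/bin" is appended
# to the old value, so no JDK directory is added at all.
BUGGY="PATH:$OLD_PATH/bin"
echo "$BUGGY"    # PATH:/usr/bin/bin

# Answer's version: the old PATH is kept and the JDK's bin is appended.
FIXED="$OLD_PATH:$JAVA_HOME/bin"
echo "$FIXED"    # /usr/bin:/usr/lib/jvm/java-11-openjdk-amd64/bin
```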

    Then, after appending the JAVA_HOME and PATH entries to your .bashrc file, run the following command so that your current shell picks them up:
    source ~/.bashrc

    Then check with echo $PATH.
    If the value contains the JAVA_HOME value, it should work.
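
One more hedged note beyond the accepted answer: start-dfs.sh launches the daemons on the worker nodes over ssh in a non-interactive shell, which on many distributions does not source ~/.bashrc, so an export that works at an interactive prompt can still be invisible to Hadoop. Hadoop's own launcher scripts do source etc/hadoop/hadoop-env.sh, so setting JAVA_HOME there on every node is the usual fix for the remote "JAVA_HOME is not set" errors (the JDK path below is the one from the question; adjust it to your install):

```shell
# $HADOOP_HOME/etc/hadoop/hadoop-env.sh  (edit on every node, then rerun
# start-dfs.sh). Hadoop sources this file itself, even when the daemon is
# started over a non-interactive ssh session that skips ~/.bashrc.
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64
```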

    Regarding "java - Hadoop with OpenJDK: start-dfs.sh (SSH?) error", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/50489111/
