
Hadoop wordcount and uploading files to HDFS

Reposted. Author: 可可西里. Updated: 2023-11-01 15:27:46

Hello everyone. I am new to Hadoop, and I installed it in pseudo-distributed mode. The configuration files are below.

core-site.xml

<configuration>

<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>

</configuration>

hdfs-site.xml

<configuration>

<property>
<name>dfs.replication</name>
<value>1</value>
</property>

<property>
<name>dfs.name.dir</name>
<value>file:///home/hadoop_usr/hadoopinfra/hdfs/namenode</value>
</property>

<property>
<name>dfs.data.dir</name>
<value>file:///home/hadoop_usr/hadoopinfra/hdfs/datanode</value>
</property>

</configuration>

and I successfully started the DataNode and NameNode.
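For reference, a typical way to bring up HDFS in pseudo-distributed mode looks like the sketch below. This assumes a standard Hadoop install with its scripts on the PATH; adjust to your installation, and note that formatting the NameNode wipes existing HDFS metadata, so it is done only once at setup.

```shell
# Format the NameNode (first-time setup only -- this erases HDFS metadata)
hdfs namenode -format

# Start the NameNode and DataNode daemons
start-dfs.sh

# Verify that the daemons are running; jps should list
# NameNode, DataNode, and SecondaryNameNode
jps
```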

Now I want to put my file into HDFS in the following way:

What went wrong when adding the file into HDFS, and why am I getting an error message? Please help me fix this.

If I put the file into HDFS in the following way instead, the command works fine: I append the HDFS URL, i.e. update the file path to a full hdfs:// URL. Please help me understand why I get an error the first way, because I also get an error message when I run my wordcount.jar and specify data.txt as the input file on which the operation should be performed.
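The behavior described is consistent with how HDFS resolves paths: a relative destination such as data/data.txt is resolved against the user's HDFS home directory (/user/&lt;username&gt;), while a fully qualified hdfs:// URI is absolute. A sketch of the two forms, assuming data.txt exists in the local working directory:

```shell
# Relative destination: resolved against /user/<your_user>/ in HDFS;
# fails with "No such file or directory" if that directory tree is missing
hadoop fs -put data.txt data/data.txt

# Fully qualified URI: an absolute path on the NameNode at localhost:9000
hadoop fs -put data.txt hdfs://localhost:9000/data/data.txt
```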

Thanks in advance.

Best Answer

The reason the first put operation on data/data.txt does not work is probably that the folder data does not exist yet in your HDFS. You can create it with hadoop fs -mkdir /data.
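A minimal end-to-end sketch following the answer's suggestion. The jar name wordcount.jar and the input file data.txt come from the question; the output path /out is an assumption, and the invocation assumes the jar's manifest names the main class (otherwise add the class name after the jar):

```shell
# Create the target directory first, then upload the input file
hadoop fs -mkdir /data
hadoop fs -put data.txt /data/data.txt

# Run the wordcount job against the uploaded file
# (output directory must not already exist)
hadoop jar wordcount.jar /data/data.txt /out

# Inspect the result
hadoop fs -cat /out/part-r-00000
```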

Regarding Hadoop wordcount and uploading files to HDFS, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/41288786/
