
ssh - How do I access Hadoop over the HDFS protocol from Java?


I've found a way to connect to Hadoop via hftp, and it works fine (read-only):

uri = "hftp://172.16.xxx.xxx:50070/";

System.out.println( "uri: " + uri );
Configuration conf = new Configuration();

FileSystem fs = FileSystem.get( URI.create( uri ), conf );
fs.printStatistics();

However, I want to read and write files, and copy them as well; that is, I want to connect over hdfs. How can I enable an hdfs connection so that I can edit the actual remote file system?

I tried changing the protocol in the code above from hftp to hdfs, but I got the following exception...

(Forgive my poor knowledge of URL protocols and Hadoop; I realize this may be a somewhat odd question, but any help would be much appreciated!)

Exception in thread "main" java.io.IOException: Call to /172.16.112.131:50070 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1139)
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at sb.HadoopRemote.main(HadoopRemote.java:24)
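For context: this EOFException is the classic symptom of pointing an hdfs:// client at the NameNode's HTTP port. Port 50070 is the web/HFTP port; the hdfs:// protocol talks to the NameNode's RPC port, which is whatever fs.default.name specifies in core-site.xml (commonly 8020 or 9000). That is also why the accepted answer below works: loading core-site.xml gives the client the correct RPC address. A minimal sketch, assuming a hypothetical RPC port of 9000 (check your own core-site.xml):

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUriSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical NameNode RPC address: use the value of
        // fs.default.name from core-site.xml, not the 50070 web port.
        String uri = "hdfs://172.16.112.131:9000/";

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);

        // List the root directory to confirm the connection works.
        for (FileStatus status : fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}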

Best answer

Just add the core-site.xml and hdfs-site.xml of the Hadoop cluster you want to hit to the conf, like this:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.testng.annotations.Test;

/**
* @author karan
*
*/
public class HadoopPushTester {

    @Test
    public void run() throws Exception {

        Configuration conf = new Configuration();

        conf.addResource(new Path("src/test/resources/HadoopConfs/core-site.xml"));
        conf.addResource(new Path("src/test/resources/HadoopConfs/hdfs-site.xml"));

        // The value of hosthdfs:port comes from fs.default.name in core-site.xml
        String dirName = "hdfs://hosthdfs:port/user/testJava";

        FileSystem fileSystem = FileSystem.get(conf);

        Path path = new Path(dirName);
        if (fileSystem.exists(path)) {
            System.out.println("Dir " + dirName + " already exists");
            return;
        }

        // Create the directory
        fileSystem.mkdirs(path);

        fileSystem.close();
    }
}
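And since the question also asks about reading, writing, and copying: once the FileSystem handle is wired up with the right configuration, the same object handles all of those. A minimal sketch under the same assumptions (the remote and local paths here are made-up placeholders):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HadoopReadWriteSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.addResource(new Path("src/test/resources/HadoopConfs/core-site.xml"));
        conf.addResource(new Path("src/test/resources/HadoopConfs/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);

        // Write a remote file (true = overwrite if it already exists).
        Path remote = new Path("/user/testJava/hello.txt"); // placeholder path
        FSDataOutputStream out = fs.create(remote, true);
        out.writeBytes("hello hdfs\n");
        out.close();

        // Read it back.
        FSDataInputStream in = fs.open(remote);
        byte[] buf = new byte[64];
        int n = in.read(buf);
        System.out.println(new String(buf, 0, n));
        in.close();

        // Copy a local file into HDFS (source path is a placeholder).
        fs.copyFromLocalFile(new Path("/tmp/local.txt"), new Path("/user/testJava/"));

        fs.close();
    }
}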

Regarding "ssh - How do I access Hadoop over the HDFS protocol from Java?", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/7844458/
