java - NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy


Previously I was creating directories in HDFS through Java on a single-node cluster, and it ran smoothly. But as soon as I set up a multi-node cluster, I started getting this error. The stack trace I get looks like this:

```
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;ZLjava/lang/String;Ljava/lang/String;Ljava/lang/Class;)Lorg/apache/hadoop/io/retry/RetryPolicy;
    at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:410)
    at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:316)
    at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
    at CreateDirectory.main(CreateDirectory.java:44)
```

Here is the CreateDirectory class:

```
import java.io.BufferedReader;
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URL;
import java.sql.SQLException;
import java.util.List;
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateDirectory {
    public static void main(String[] args) throws SQLException, ClassNotFoundException {
        String hdfsUri = "hdfs://localhost:9000/";
        //String dirName = args[0];
        String dirName = null;
        // String filename = args[1];
        String filename;

        // Braces added: without them, only the dirName assignment was guarded by the if.
        if (args.length <= 0) {
            dirName = "ekbana";
            filename = "text.csv";
        }

        URL url = null;
        BufferedReader in = null;
        FileSystem hdfs = null;
        FSDataOutputStream outStream = null;
        HttpURLConnection conn = null;
        List<Map<String, String>> flatJson;
        Configuration con = new Configuration();

        try {
            url = new URL("http://crm.bigmart.com.np:81/export/export-sales-data.php?sdate=2016-12-01&edate=2016-12-02&key=jdhcvuicx8ruqe9djskjf90ueddishr0uy8v9hbjncvuw0er8idsnv");
        } catch (MalformedURLException ex) {
        }

        try {
            con.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
            con.set("fs.file.impl", org.apache.hadoop.fs.LocalFileSystem.class.getName());
            hdfs = FileSystem.get(URI.create(hdfsUri), con); // this is line 44
        } catch (IOException e) {
            e.printStackTrace();
        }

        try {
            System.out.println(hdfs.mkdirs(new Path(hdfsUri + "/" + dirName)));
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
```

Solutions on many sites say I need hadoop-common, which I already have, yet I still get this error. I doubt the retry policy is related to my setup; if it isn't, why does this error occur?

Best Answer

Adding the Maven dependency helped:

```
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.8.1</version>
</dependency>
```
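
A NoSuchMethodError like this usually points to mismatched Hadoop jar versions on the classpath: the stack trace shows hadoop-hdfs's NameNodeProxies calling a RetryUtils.getDefaultRetryPolicy overload that the hadoop-common actually loaded does not provide, so hadoop-common and hadoop-hdfs should be declared at the same version (2.8.1 here). To verify which jars the two sides are loaded from at runtime, a quick check like the following can help. This is a minimal sketch using the standard ProtectionDomain API; the class name HadoopClasspathCheck is only an illustration, and the two Hadoop classes are taken from the stack trace above:

```
import org.apache.hadoop.hdfs.DFSClient;
import org.apache.hadoop.io.retry.RetryUtils;

public class HadoopClasspathCheck {
    public static void main(String[] args) {
        // Print the jar each class was loaded from; both locations should
        // point at jars of the same Hadoop version (e.g. both 2.8.1).
        System.out.println(RetryUtils.class
                .getProtectionDomain().getCodeSource().getLocation());
        System.out.println(DFSClient.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}
```

At build time, `mvn dependency:tree` gives the same picture and makes it easy to spot an older hadoop-common being pulled in transitively.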

Regarding java - NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44923580/
