
java - hadoop jar error when copying data from MongoDB to HDFS

Reposted | Author: 行者123 | Updated: 2023-12-02 20:56:40

I am trying to copy a collection from MongoDB to Hadoop using the mongo-hadoop connector, with the following code:
package hdfs;

import java.io.*;
import org.apache.commons.logging.*;
import org.apache.hadoop.conf.*;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.lib.output.*;
import org.apache.hadoop.mapreduce.*;
import org.bson.*;
import com.mongodb.hadoop.*;
import com.mongodb.hadoop.util.*;

public class ImportWeblogsFromMongo {

    private static final Log log = LogFactory.getLog(ImportWeblogsFromMongo.class);

    // Map-only job: each MongoDB document arrives as a BSONObject value.
    public static class ReadWeblogsFromMongo extends Mapper<Object, BSONObject, Text, Text> {

        @Override
        public void map(Object key, BSONObject value, Context context)
                throws IOException, InterruptedException {
            System.out.println("Key: " + key);
            System.out.println("Value: " + value);
            String md5 = value.get("md5").toString();
            String url = value.get("url").toString();
            String date = value.get("date").toString();
            String time = value.get("time").toString();
            String ip = value.get("ip").toString();
            String output = "\t" + url + "\t" + date + "\t" + time + "\t" + ip;
            context.write(new Text(md5), new Text(output));
        }
    }

    public static void main(String[] args) throws Exception {
        final Configuration conf = new Configuration();
        // Read the "fish" collection of the "clusterdb" database.
        MongoConfigUtil.setInputURI(conf, "mongodb://localhost:27017/clusterdb.fish");
        MongoConfigUtil.setCreateInputSplits(conf, false);
        System.out.println("Configuration: " + conf);
        @SuppressWarnings("deprecation")
        final Job job = new Job(conf, "Mongo Import");
        Path out = new Path("/home/mongo_import");
        FileOutputFormat.setOutputPath(job, out);
        job.setJarByClass(ImportWeblogsFromMongo.class);
        job.setMapperClass(ReadWeblogsFromMongo.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(MongoInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);
        job.setNumReduceTasks(0); // map-only
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

1. I exported a jar file named importmongo.jar.
2. I then ran the command hadoop jar /home/yass/importmongo.jar hdfs.ImportWeblogsFromMongo, but got the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/hadoop/util/MongoConfigUtil
    at hdfs.ImportWeblogsFromMongo.main(ImportWeblogsFromMongo.java:33)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.mongodb.hadoop.util.MongoConfigUtil
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

Note: clusterdb is the database name and fish is its collection,
and hdfs.ImportWeblogsFromMongo is the package.class.

Any suggestions?
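The NoClassDefFoundError itself says that the mongo-hadoop connector classes are not on the classpath when hadoop jar launches the driver: the exported importmongo.jar contains only the project's own classes. Common remedies are to bundle the connector and the MongoDB Java driver into a fat jar, to add them to the client classpath via the HADOOP_CLASSPATH environment variable, or to ship them to the task nodes with the generic -libjars option. That last option is only honored when the driver goes through ToolRunner; a minimal sketch of that pattern, with the class name ImportWeblogsFromMongoDriver assumed purely for illustration:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical driver (name assumed, not from the original post): going
// through ToolRunner makes Hadoop parse generic options such as -libjars,
// which distributes the listed jars to the task classpath.
public class ImportWeblogsFromMongoDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Configuration conf = getConf(); // already carries any -libjars / -D options
        // ... same job setup as in ImportWeblogsFromMongo.main() ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(),
                new ImportWeblogsFromMongoDriver(), args));
    }
}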

Best Answer

I did not manage to solve it this way, but I found a workaround: dump the data with mongodump and copy the resulting file to HDFS. The few lines below may help someone get the job done.

# Dump the collection from MongoDB to a BSON file
mongodump --db clusterdb --collection CollectionName

# Convert the BSON dump to line-oriented JSON
bsondump file.bson > file.json

# Copy the JSON file from the local filesystem into HDFS
hadoop dfs -copyFromLocal /path/to/file/fish.json mongo
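Once fish.json sits in HDFS it is ordinary line-oriented text, so it can be processed by a stock TextInputFormat job with no mongo-hadoop classes on the classpath at all. A hypothetical mapper sketch (not part of the original answer; the class name is assumed):

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper (assumed, for illustration): with TextInputFormat,
// each call receives the byte offset of a line and the line itself,
// i.e. one JSON document produced by bsondump.
public class ReadWeblogsFromJson extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void map(LongWritable offset, Text jsonLine, Context context)
            throws IOException, InterruptedException {
        // Pass the raw JSON through; field extraction would happen here.
        context.write(new Text(Long.toString(offset.get())), jsonLine);
    }
}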

Regarding "java - hadoop jar error when copying data from MongoDB to HDFS", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/44314282/
