
java - Hadoop: accepting custom parameters in the driver

Reposted · Author: 可可西里 · Updated: 2023-11-01 16:56:25

How do I pass custom parameters to my Hadoop MapReduce job?

For example, suppose my driver contains:

    public static void main(String[] args) throws Exception {
        // Expected arguments: <param1> <param2> <input path> <output path>
        try {
            String one = args[0];
            String two = args[1];
            System.out.println(two);
            System.out.println(one);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("ArrayIndexOutOfBoundsException caught");
            System.exit(2); // don't fall through to args[2]/args[3] below
        }

        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[2]));
        FileOutputFormat.setOutputPath(job, new Path(args[3]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
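As a side note on the question itself: besides reading positional arguments in the driver, a common way to make custom parameters visible inside mappers and reducers is to store them in the job `Configuration` before the `Job` is created (e.g. `conf.set("wordcount.custom.param", args[0])`). A minimal sketch of the mapper side, assuming that setup; the key name `wordcount.custom.param` and the class `ParamAwareMapper` are illustrative, not from the original post:

```java
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Mapper that reads a custom parameter back out of the job Configuration.
public class ParamAwareMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private String customParam;

    @Override
    protected void setup(Context context) {
        // The driver is assumed to have called
        //   conf.set("wordcount.custom.param", args[0])
        // before Job.getInstance(conf, ...).
        customParam = context.getConfiguration()
                             .get("wordcount.custom.param", "default");
    }

    @Override
    protected void map(LongWritable key, Text value, Context context) {
        // ... use customParam while processing each record ...
    }
}
```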

After packaging the JAR file, when I run the command:

hadoop jar str1 str2 /home/bli1/wordcount/wc.jar /user/bli1/wordcount/input /user/bli1/wordcount/testout

I get:

Not a valid JAR: /nfsdata/DSCluster/home/bli1/wordcount/str1

Best Answer

The parameters need to go after the JAR file reference, for example:

hadoop jar /home/bli1/wordcount/wc.jar str1 str2 /user/bli1/wordcount/input /user/bli1/wordcount/testout
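With the arguments placed after the JAR path, the driver's `main()` receives only the tokens that follow the JAR. A plain-Java sketch of how the shell tokens then map onto `args[]` (paths copied from the question; the class and helper names are illustrative):

```java
// Shows how the tokens after the JAR path land in the driver's args[].
public class ArgOrderDemo {

    // Returns the two custom parameters, assuming the layout
    // <param1> <param2> <input> <output> used in the question.
    static String[] customParams(String[] args) {
        return new String[] { args[0], args[1] };
    }

    public static void main(String[] argv) {
        // What main() would receive from the corrected command line.
        String[] args = {
            "str1",
            "str2",
            "/user/bli1/wordcount/input",
            "/user/bli1/wordcount/testout"
        };
        String[] custom = customParams(args);
        System.out.println("custom: " + custom[0] + ", " + custom[1]);
        System.out.println("input:  " + args[2]);
        System.out.println("output: " + args[3]);
    }
}
```

In the failing command, `str1` sat where `hadoop jar` expects the JAR path, which is exactly why it reported "Not a valid JAR".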

Regarding "java - Hadoop: accepting custom parameters in the driver", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/29290580/
