
hadoop - Hadoop Image Processing Interface (HIPI) sample program

Reposted · Author: 行者123 · Updated: 2023-12-02 21:34:38

I am working with HIPI and started with its sample program.

I cannot run it, because it always fails with the following exception:

hadoop jar Desktop/edureka/workspace/jars/SampleProgramHIPI.jar hdfs:/video/sampleimages.hib hdfs:/video/sampleimages.output
15/10/16 15:59:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: hipi/imagebundle/mapreduce/ImageBundleInputFormat
at SampleProgram.run(SampleProgram.java:67)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at SampleProgram.main(SampleProgram.java:86)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: hipi.imagebundle.mapreduce.ImageBundleInputFormat
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 10 more

The build path has been updated with the given library (hipi/core/build/libs/hipi-2.1.0.jar).

I have checked almost everywhere! Please help me.
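Since a `NoClassDefFoundError` at job-submission time usually means the dependency is missing from the runtime classpath rather than from the IDE build path, one quick check is to list the contents of the submitted jar. This is only a diagnostic sketch; the jar path is taken from the question's own command and may need adjusting.

```shell
# List the application jar's contents and look for the HIPI class
# named in the exception (path taken from the question; adjust as needed).
jar tf ~/Desktop/edureka/workspace/jars/SampleProgramHIPI.jar | grep ImageBundleInputFormat
# No output means hipi-2.1.0.jar was only on the Eclipse build path and was
# never packaged into (or shipped alongside) the runnable jar.
```

An IDE build path only affects compilation; it does not bundle the dependency into the exported jar or put it on the cluster's classpath.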

Here is the sample program I am trying to run:
package hipi.image.examples;

import hipi.image.FloatImage;
import hipi.image.ImageHeader;
import hipi.imagebundle.mapreduce.ImageBundleInputFormat;
//import org.hipi.imagebundle.mapreduce.HibInputFormat;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
import java.io.IOException;

public class SampleProgram extends Configured implements Tool {

    public static class HelloWorldMapper extends Mapper<ImageHeader, FloatImage, IntWritable, FloatImage> {
        @Override
        public void map(ImageHeader key, FloatImage value, Context context)
                throws IOException, InterruptedException {
            // Only process valid 3-band (RGB) images
            if (value != null && value.getWidth() > 1 && value.getHeight() > 1 && value.getBands() == 3) {
                int w = value.getWidth();
                int h = value.getHeight();
                float[] valData = value.getData();
                float[] avgData = {0, 0, 0};
                // Sum each channel over all pixels
                for (int j = 0; j < h; j++) {
                    for (int i = 0; i < w; i++) {
                        avgData[0] += valData[(j * w + i) * 3 + 0];
                        avgData[1] += valData[(j * w + i) * 3 + 1];
                        avgData[2] += valData[(j * w + i) * 3 + 2];
                    }
                }
                // Emit the per-image average pixel as a 1x1 image under a single key
                FloatImage avg = new FloatImage(1, 1, 3, avgData);
                avg.scale(1.0f / (float) (w * h));
                context.write(new IntWritable(1), avg);
            }
        }
    }

    public static class HelloWorldReducer extends Reducer<IntWritable, FloatImage, IntWritable, Text> {
        @Override
        public void reduce(IntWritable key, Iterable<FloatImage> values, Context context)
                throws IOException, InterruptedException {
            // Average the per-image averages across the whole bundle
            FloatImage avg = new FloatImage(1, 1, 3);
            int total = 0;
            for (FloatImage val : values) {
                avg.add(val);
                total++;
            }
            if (total > 0) {
                avg.scale(1.0f / total);
                float[] avgData = avg.getData();
                String result = String.format("Average pixel value: %f %f %f",
                        avgData[0], avgData[1], avgData[2]);
                context.write(key, new Text(result));
            }
        }
    }

    public int run(String[] args) throws Exception {
        if (args.length != 2) {
            System.out.println("Usage: helloWorld <input HIB> <output directory>");
            System.exit(0);
        }
        Job job = Job.getInstance();
        job.setInputFormatClass(ImageBundleInputFormat.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        job.setJarByClass(SampleProgram.class);
        job.setMapperClass(HelloWorldMapper.class);
        job.setReducerClass(HelloWorldReducer.class);

        job.setMapOutputKeyClass(IntWritable.class);
        job.setMapOutputValueClass(FloatImage.class);

        job.setOutputKeyClass(IntWritable.class);
        job.setOutputValueClass(Text.class);

        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        boolean success = job.waitForCompletion(true);
        return success ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        ToolRunner.run(new SampleProgram(), args);
        System.exit(0);
    }
}

Best Answer

Try the -libjars option as shown below. -libjars uploads the given jars to the cluster and makes them available on the classpath of every mapper and reducer instance. To add extra libraries (jars) to the driver/client classpath, use the HADOOP_CLASSPATH environment variable.

export HADOOP_CLASSPATH=hipi-0.0.1.jar
hadoop jar ~/Desktop/edureka/workspace/jars/SampleProgramHIPI.jar hipi.image.examples.SampleProgram hdfs:/video/sampleimages.hib hdfs:/video/sampleimages.output
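Putting both pieces together, a fuller invocation might look like the sketch below. The jar paths are assumptions based on the question (the asker's build produced hipi-2.1.0.jar); adjust them to your layout.

```shell
# 1. Put the HIPI jar on the driver/client classpath so job submission
#    can resolve ImageBundleInputFormat (path is an assumption).
export HADOOP_CLASSPATH=/path/to/hipi/core/build/libs/hipi-2.1.0.jar

# 2. Ship the same jar to every mapper/reducer with -libjars.
#    Generic options such as -libjars go after the main class name
#    and before the program's own arguments.
hadoop jar ~/Desktop/edureka/workspace/jars/SampleProgramHIPI.jar \
    hipi.image.examples.SampleProgram \
    -libjars /path/to/hipi/core/build/libs/hipi-2.1.0.jar \
    hdfs:/video/sampleimages.hib hdfs:/video/sampleimages.output
```

One caveat: -libjars is parsed by ToolRunner into the Tool's Configuration, so it only takes effect if run() builds the job from that configuration (e.g. `Job.getInstance(getConf())` rather than the bare `Job.getInstance()` in the posted code). Alternatively, packaging the HIPI classes into the application jar itself (a "fat jar") avoids the classpath juggling entirely.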

For hadoop - Hadoop Image Processing Interface (HIPI) sample program, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/33170087/
