
java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hbase.io.ImmutableBytesWritable

Reposted. Author: 可可西里. Updated: 2023-11-01 14:23:26

I am new to Hadoop. I am using Hadoop 2.3.0 and HBase 0.98.3. I am trying to use MapReduce to extract data from a text file and write it into an HBase table. Although I set the job's outputKeyClass and outputValueClass, I get a ClassCastException. Can anyone help me?

Here is my code.

public static void main(String[] args) {
    Configuration config = HBaseConfiguration.create();
    Job job;
    try {
        job = new Job(config, "LogBulkLoader");
        job.setJarByClass(Main.class);

        job.setMapperClass(LogMapper.class);

        job.setOutputFormatClass(TableOutputFormat.class);
        job.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, "fatih");

        job.setOutputKeyClass(ImmutableBytesWritable.class);
        job.setOutputValueClass(Put.class);

        FileInputFormat.addInputPath(job, new Path(userActionsTestFile));
        job.setNumReduceTasks(0);
        job.waitForCompletion(true);

    } catch (IOException e) {
        e.printStackTrace();
    } catch (ClassNotFoundException e) {
        e.printStackTrace();
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}

public static class LogMapper extends
        TableMapper<ImmutableBytesWritable, Put> {

    @Override
    protected void setup(Context context) throws IOException,
            InterruptedException {
    }

    @Override
    protected void map(ImmutableBytesWritable key, Result value,
            Context context) throws IOException, InterruptedException {
        try {
            String[] l = value.toString().split(",");

            String[] t = l[4].split(" ");
            String[] date = t[0].split("-");
            String[] time = t[1].split(":");

            GregorianCalendar gc = new GregorianCalendar(
                    Integer.parseInt(date[0]), Integer.parseInt(date[1]),
                    Integer.parseInt(date[2]), Integer.parseInt(time[0]),
                    Integer.parseInt(time[1]), Integer.parseInt(time[2]));

            Put put = new Put(Bytes.toBytes(l[0]));

            put.add(Bytes.toBytes("song"), Bytes.toBytes(l[1]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[6]));

            put.add(Bytes.toBytes("album"), Bytes.toBytes(l[1]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));
            put.add(Bytes.toBytes("album"), Bytes.toBytes(l[2]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));

            put.add(Bytes.toBytes("singer"), Bytes.toBytes(l[1]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));
            put.add(Bytes.toBytes("singer"), Bytes.toBytes(l[2]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));
            put.add(Bytes.toBytes("singer"), Bytes.toBytes(l[3]),
                    gc.getTimeInMillis(), Bytes.toBytes(l[5]));

            context.write(new ImmutableBytesWritable(l[0].getBytes()), put);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
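A side note on the timestamp parsing above (separate from the cast error): `GregorianCalendar` months are 0-based (`Calendar.JANUARY == 0`), so passing the parsed month straight through shifts every timestamp forward by one month. A minimal, self-contained sketch of the corrected parsing (class and method names here are illustrative, not from the original code):

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class TimestampParse {
    // Parse a "yyyy-MM-dd HH:mm:ss" string into epoch milliseconds,
    // mirroring the splitting done in the mapper above.
    static long toMillis(String ts) {
        String[] t = ts.split(" ");
        String[] date = t[0].split("-");
        String[] time = t[1].split(":");
        // GregorianCalendar months are 0-based (January == 0), so the
        // parsed calendar month must be decremented by one.
        return new GregorianCalendar(
                Integer.parseInt(date[0]), Integer.parseInt(date[1]) - 1,
                Integer.parseInt(date[2]), Integer.parseInt(time[0]),
                Integer.parseInt(time[1]), Integer.parseInt(time[2]))
                .getTimeInMillis();
    }

    public static void main(String[] args) {
        Calendar c = Calendar.getInstance();
        c.setTimeInMillis(toMillis("2014-06-24 10:30:00"));
        System.out.println(c.get(Calendar.MONTH) == Calendar.JUNE); // prints "true"
    }
}
```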

I get the following exception.

java.lang.Exception: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hbase.io.ImmutableBytesWritable
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:403)
Caused by: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hbase.io.ImmutableBytesWritable
    at com.argedor.module1.Main$LogMapper.map(Main.java:1)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$MapTaskRunnable.run(LocalJobRunner.java:235)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)

Best answer

Add the following to the job configuration:

    job.setMapOutputKeyClass(ImmutableBytesWritable.class);
    job.setMapOutputValueClass(Put.class);
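Applied to the driver above, the relevant part of the job setup would look roughly like this (a sketch, using the same classes already imported in the question's code):

```java
// Declare the map output types explicitly; setOutputKeyClass/setOutputValueClass
// alone only describe the job's final output, not the map phase.
job.setMapOutputKeyClass(ImmutableBytesWritable.class);
job.setMapOutputValueClass(Put.class);

// Final output types, as already set in the question's driver.
job.setOutputKeyClass(ImmutableBytesWritable.class);
job.setOutputValueClass(Put.class);
```

Note also that the stack trace points inside `LogMapper.map()`: `TableMapper` fixes the mapper's input types to `(ImmutableBytesWritable, Result)`, which is meant for reading *from* an HBase table, while `FileInputFormat` over a text file delivers `(LongWritable, Text)`. Extending `Mapper<LongWritable, Text, ImmutableBytesWritable, Put>` and changing `map()`'s parameters to match may therefore also be necessary.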

Regarding java.lang.ClassCastException: org.apache.hadoop.io.LongWritable cannot be cast to org.apache.hadoop.hbase.io.ImmutableBytesWritable, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/24387984/
