
java - ClassCastException in Hadoop


When I launch my MapReduce program, I get this error:

java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.io.BytesWritable
at nflow.hadoop.flow.analyzer.Calcul$Calcul_Mapper.map(Calcul.java:1)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:212)

The mapper's code:
public static class Calcul_Mapper extends Mapper<LongWritable, BytesWritable, Text, Text> {

    String delimiter = "|";
    long interval = 60 * 60;

    Calendar cal;

    public void map(LongWritable key, BytesWritable value, Context context)
            throws IOException, InterruptedException {

        byte[] value_bytes = value.getBytes();
        if (value_bytes.length < FlowWritable.MIN_PKT_SIZE + FlowWritable.PCAP_HLEN) return;

        EZBytes eb = new EZBytes(value_bytes.length);
        eb.PutBytes(value_bytes, 0, value_bytes.length);

        // C2S key ==> protocol | srcIP | dstIP | sPort | dPort
        long sys_uptime = Bytes.toLong(eb.GetBytes(FlowWritable.PCAP_ETHER_IP_UDP_HLEN + 4, 4));
        long timestamp = Bytes.toLong(eb.GetBytes(FlowWritable.PCAP_ETHER_IP_UDP_HLEN + 8, 4)) * 1000000
                + Bytes.toLong(BinaryUtils.flipBO(eb.GetBytes(FlowWritable.PCAP_ETHER_IP_UDP_HLEN + 12, 4), 4));

        int count = eb.GetShort(FlowWritable.PCAP_ETHER_IP_UDP_HLEN + 2);

        FlowWritable fw;
        byte[] fdata = new byte[FlowWritable.FLOW_LEN];
        int cnt_flows = 0;
        int pos = FlowWritable.PCAP_ETHER_IP_UDP_HLEN + FlowWritable.CFLOW_HLEN;

        try {
            while (cnt_flows++ < count) {
                fw = new FlowWritable();
                fdata = eb.GetBytes(pos, FlowWritable.FLOW_LEN);

                if (fw.parse(sys_uptime, timestamp, fdata)) {
                    context.write(new Text("Packet"), new Text(Integer.toString(1)));
                    context.write(new Text("Byte"), new Text(Integer.toString(1)));
                    context.write(new Text("Flow"), new Text(Integer.toString(1)));
                    context.write(new Text("srcPort"), new Text(Integer.toString(fw.getSrcport())));
                    context.write(new Text("dstPort"), new Text(Integer.toString(fw.getDstport())));
                    context.write(new Text("srcAddr"), new Text(fw.getSrcaddr()));
                    context.write(new Text("dstAddr"), new Text(fw.getDstaddr()));
                }
                pos += FlowWritable.FLOW_LEN;
            }
        } catch (NumberFormatException e) {
            // ignored
        }
    }
}

Does anyone know what is going wrong?

Best Answer

Could you check your job configuration? In particular, check these calls:

conf.setOutputKeyClass(Something.class);
conf.setOutputValueClass(Something.class);
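
For reference, here is a minimal driver sketch showing where those calls belong; the driver class name, the paths, and the PcapInputFormat placeholder are assumptions, not taken from the question. Note that a Text-to-BytesWritable cast on the map input usually means the job is still using the default TextInputFormat, which produces <LongWritable, Text> records, so the InputFormat setting deserves a look as well:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CalculDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "flow analyzer");
        job.setJarByClass(CalculDriver.class);
        job.setMapperClass(Calcul.Calcul_Mapper.class); // mapper from the question

        // These must match the mapper's declared output types <Text, Text>.
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);

        // Without this, the default TextInputFormat hands the mapper Text
        // values, which is exactly the cast that fails. PcapInputFormat is
        // a hypothetical custom format that yields BytesWritable records.
        // job.setInputFormatClass(PcapInputFormat.class);

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}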

By the way, since your keys are always fixed constants, you don't need to create them anew on every emit from the map function; see the sketch below.
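
A small sketch of that idea, reusing pre-built Text objects the same way the stock WordCount example reuses its IntWritable (the field names are illustrative):

public static class Calcul_Mapper extends Mapper<LongWritable, BytesWritable, Text, Text> {
    // Constant keys and the "1" value built once per mapper,
    // instead of once per context.write() call.
    private static final Text PACKET = new Text("Packet");
    private static final Text BYTE   = new Text("Byte");
    private static final Text FLOW   = new Text("Flow");
    private static final Text ONE    = new Text("1");

    public void map(LongWritable key, BytesWritable value, Context context)
            throws IOException, InterruptedException {
        // ... parsing as before ...
        context.write(PACKET, ONE);
        context.write(BYTE, ONE);
        context.write(FLOW, ONE);
        // ...
    }
}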

Also, I think it would be better to have a custom key object that combines everything together. For that, you would extend ObjectWritable and implement WritableComparable.
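
A minimal sketch of such a composite key, assuming (purely for illustration) that a flow is identified by its source address and source port:

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.WritableComparable;

public class FlowKey implements WritableComparable<FlowKey> {
    private String srcAddr = "";
    private int srcPort;

    public FlowKey() {} // no-arg constructor required: Hadoop instantiates keys reflectively

    public FlowKey(String srcAddr, int srcPort) {
        this.srcAddr = srcAddr;
        this.srcPort = srcPort;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeUTF(srcAddr);
        out.writeInt(srcPort);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        srcAddr = in.readUTF();
        srcPort = in.readInt();
    }

    @Override
    public int compareTo(FlowKey other) {
        int c = srcAddr.compareTo(other.srcAddr);
        return (c != 0) ? c : Integer.compare(srcPort, other.srcPort);
    }

    @Override
    public int hashCode() { // used by the default HashPartitioner
        return srcAddr.hashCode() * 31 + srcPort;
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof FlowKey)) return false;
        FlowKey k = (FlowKey) o;
        return srcPort == k.srcPort && srcAddr.equals(k.srcAddr);
    }
}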

Your writes/emits also look quite suspicious to me (for instance, emitting the count 1 as the Text string "1" on every record rather than as an IntWritable).

Regarding java - ClassCastException in Hadoop, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/21652278/
