hadoop - Apache Flume custom interceptor - binary and weird HDFS files

I'm fairly new to the Flume interceptor concept and am facing an issue: before applying the interceptor, the files written by the sink were plain text files, but after applying the interceptor everything gets messed up.

My interceptor code is as follows:

package com.flume;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.FlumeException;
import org.apache.flume.interceptor.Interceptor;

import java.net.InetAddress;
import java.net.UnknownHostException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class CustomHostInterceptor implements Interceptor {

    private String hostValue;
    private final String hostHeader;

    public CustomHostInterceptor(String hostHeader) {
        this.hostHeader = hostHeader;
    }

    @Override
    public void initialize() {
        // At interceptor start-up: resolve the local hostname once
        try {
            hostValue = InetAddress.getLocalHost().getHostName();
        } catch (UnknownHostException e) {
            throw new FlumeException("Cannot get Hostname", e);
        }
    }

    @Override
    public Event intercept(Event event) {
        // Replace the event body with "hadoop" if it contains "text"
        String body = new String(event.getBody(), StandardCharsets.UTF_8);
        if (body.toLowerCase().contains("text")) {
            event.setBody("hadoop".getBytes(StandardCharsets.UTF_8));
        }

        // Enrich the event's headers with the hostname
        Map<String, String> headers = event.getHeaders();
        headers.put(hostHeader, hostValue);

        // Let the enriched event go
        return event;
    }

    @Override
    public List<Event> intercept(List<Event> events) {
        List<Event> interceptedEvents = new ArrayList<Event>(events.size());
        for (Event event : events) {
            // Intercept every event in the batch
            interceptedEvents.add(intercept(event));
        }
        return interceptedEvents;
    }

    @Override
    public void close() {
        // At interceptor shutdown: nothing to clean up
    }

    public static class Builder implements Interceptor.Builder {

        private String hostHeader;

        @Override
        public void configure(Context context) {
            // Retrieve the header name from the Flume conf
            hostHeader = context.getString("hostHeader");
        }

        @Override
        public Interceptor build() {
            return new CustomHostInterceptor(hostHeader);
        }
    }
}
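
As a quick sanity check off-cluster, the body-replacement logic can be exercised with Flume's EventBuilder from the SDK. This is a minimal sketch (not from the original post), assuming the Flume SDK jars are on the classpath; the class name CustomHostInterceptorCheck is just for illustration:

import org.apache.flume.Event;
import org.apache.flume.event.EventBuilder;
import java.nio.charset.StandardCharsets;

public class CustomHostInterceptorCheck {
    public static void main(String[] args) {
        CustomHostInterceptor interceptor = new CustomHostInterceptor("hostname");
        interceptor.initialize();

        // A body containing "text" should be rewritten to "hadoop"
        Event event = EventBuilder.withBody("some text", StandardCharsets.UTF_8);
        event = interceptor.intercept(event);

        System.out.println("body   = " + new String(event.getBody(), StandardCharsets.UTF_8));
        System.out.println("header = " + event.getHeaders().get("hostname"));

        interceptor.close();
    }
}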

The Flume conf is:
agent.sources=exec-source
agent.sinks=hdfs-sink
agent.channels=ch1

agent.sources.exec-source.type=exec
agent.sources.exec-source.command=tail -F /home/cloudera/Desktop/app.log
agent.sources.exec-source.interceptors = i1
agent.sources.exec-source.interceptors.i1.type = com.flume.CustomHostInterceptor$Builder
agent.sources.exec-source.interceptors.i1.hostHeader = hostname

agent.sinks.hdfs-sink.type=hdfs
agent.sinks.hdfs-sink.hdfs.path=hdfs://localhost:8020/bosch/flume/applogs
agent.sinks.hdfs-sink.hdfs.filePrefix=logs
agent.sinks.hdfs-sink.hdfs.rollInterval=60
agent.sinks.hdfs-sink.hdfs.rollSize=0

agent.channels.ch1.type=memory
agent.channels.ch1.capacity=1000

agent.sources.exec-source.channels=ch1
agent.sinks.hdfs-sink.channel=ch1

Doing a cat on the file created in HDFS gives:
SEQ!org.apache.hadoop.io.LongWritable"org.apache.hadoop.io.BytesWritable���*q�CJv�/ESmP�ź
some textP�żc
some more textP���K
textP��ߌangels and deamonsP��%�
text bla blaP��1�angels and deamonsP��1�
testP��1�hmmmP��1�anything

Any suggestions?

Thanks

Best Answer

The interceptor looks fine.

In your Flume agent configuration, you haven't specified the hdfs.fileType property, so the sink falls back to the default, SequenceFile.
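
That default explains the output above: the file header names LongWritable keys and BytesWritable values, so the event bodies are still intact inside the SequenceFile container. If you want to confirm that before changing the sink, here is a minimal sketch using Hadoop's SequenceFile.Reader (the class name DumpFlumeSeqFile and the file path are placeholders for illustration) that should print the original lines:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;

public class DumpFlumeSeqFile {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder path: point this at one of the rolled files under the sink directory
        Path path = new Path("hdfs://localhost:8020/bosch/flume/applogs/logs.1234567890");

        SequenceFile.Reader reader = new SequenceFile.Reader(conf, SequenceFile.Reader.file(path));
        try {
            LongWritable key = new LongWritable();     // event timestamp
            BytesWritable value = new BytesWritable(); // event body
            while (reader.next(key, value)) {
                System.out.println(new String(value.copyBytes(), "UTF-8"));
            }
        } finally {
            reader.close();
        }
    }
}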

Try adding this line to your HDFS sink and let me know if it works:

agent.sinks.hdfs-sink.hdfs.fileType=DataStream 
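
With DataStream, the sink writes the raw event bodies as a plain stream (the default text serializer appends a newline after each event), so the files become readable again with a plain cat. Under that assumption, the corrected sink block would look like this, with the added line being the only change:

agent.sinks.hdfs-sink.type=hdfs
agent.sinks.hdfs-sink.hdfs.path=hdfs://localhost:8020/bosch/flume/applogs
agent.sinks.hdfs-sink.hdfs.filePrefix=logs
agent.sinks.hdfs-sink.hdfs.fileType=DataStream
agent.sinks.hdfs-sink.hdfs.rollInterval=60
agent.sinks.hdfs-sink.hdfs.rollSize=0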

Regarding hadoop - Apache Flume custom interceptor - binary and weird HDFS files, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/33421863/
