
hadoop - Writing to a file with the SequenceFile class


I am using the following code to write data to a file in SequenceFile format. After the program has been running for a while, I interrupt it with the red stop button in the Eclipse console. However, when I then check the data file on HDFS, the sequence file's size is zero, and the file cannot be viewed with the 'hadoop fs -text filename' command. When I try to read the previously created file with SequenceFile.Reader, I get 'Exception in thread "main" java.io.EOFException'. What should I do in this situation? My development environment is Eclipse 3.7 (on Windows 7) and a Hadoop cluster (Hadoop version 1.0.3) on CentOS 6.

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

class Sequence extends Thread {

    private String uri = "hdfs://172.20.11.60:9000";
    private String filePath = "/user/hadoop/input/";
    private String fileName = "Sequence-01.seq";
    public SequenceFile.Writer writer;
    private static int cnt = 0;

    private void init() {
        Configuration conf = new Configuration();
        try {
            FileSystem fs = FileSystem.get(URI.create(uri), conf);
            writer = SequenceFile.createWriter(fs, conf, new Path(filePath
                    + fileName), LongWritable.class, Text.class);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public Sequence() {
        init();
    }

    @Override
    public void run() {
        while (true) {
            try {
                writer.append(new LongWritable(100), new Text("hello,world"));
                cnt++;
                if (cnt % 100 == 0) {
                    System.out.println("flush current data to file system");
                    writer.syncFs();
                }
            } catch (IOException e) {
                System.out.println("append data error");
                e.printStackTrace();
            }

            try {
                Thread.sleep(1000);
            } catch (InterruptedException e) {
                System.out.println("thread interrupted");
                e.printStackTrace();
            }
        }
    }
}

public class TestSequenceFile {

    public static void main(String[] args) {
        new Sequence().start();
    }
}

Best Answer

General advice: do not interrupt the process. A SequenceFile only becomes readable once its writer is closed; if the JVM is killed mid-write, the namenode never records the file's final length, which is why the file shows up as zero bytes and SequenceFile.Reader fails with an EOFException.
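If the process may still be stopped from outside, one mitigation is to close the writer in a JVM shutdown hook so the file is finalized before the process exits. The following is a minimal sketch, not part of the original answer; it reuses the Sequence class and its public writer field from the question, and the SafeSequenceRunner wrapper is hypothetical:

public class SafeSequenceRunner {

    public static void main(String[] args) {
        final Sequence seq = new Sequence();
        // Close the SequenceFile.Writer when the JVM shuts down normally
        // (e.g. Ctrl-C). Note: a hard kill bypasses shutdown hooks, and
        // Eclipse's red Terminate button may kill the JVM hard, so this
        // does not cover every way the process can die.
        Runtime.getRuntime().addShutdownHook(new Thread() {
            @Override
            public void run() {
                try {
                    if (seq.writer != null) {
                        // Closing finalizes the file so HDFS records its real length.
                        seq.writer.close();
                    }
                } catch (java.io.IOException e) {
                    e.printStackTrace();
                }
            }
        });
        seq.start();
    }
}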

Solution: for me, the following code works well.

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileWriteDemo {

    private static final String[] DATA = {
        "One, two, buckle my shoe",
        "Three, four, shut the door",
        "Five, six, pick up sticks",
        "Seven, eight, lay them straight",
        "Nine, ten, a big fat hen"
    };

    public static void main(String[] args) throws IOException {
        // String uri = "/home/Desktop/inputSort.txt";
        String uri = "hdfs://localhost:9900/out1.seq";

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        Path path = new Path(uri);
        IntWritable key = new IntWritable();
        Text value = new Text();
        SequenceFile.Writer writer = null;

        try {
            writer = SequenceFile.createWriter(fs, conf, path,
                    key.getClass(), value.getClass());

            for (int i = 0; i < 130; i++) {
                key.set(100 - i);
                value.set(DATA[i % DATA.length]);

                System.out.printf("[%s]\t%s\t%s\n", writer.getLength(), key, value);

                writer.append(key, value);
            }
        } finally {
            // Closing the writer (even on error) is what finalizes the file on HDFS.
            IOUtils.closeStream(writer);
        }
    }
}
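To check that the file came out intact, and to avoid the EOFException from the question, the file can be read back with SequenceFile.Reader. This is a minimal sketch against the same Hadoop 1.x API, assuming the out1.seq path used above; it is not part of the original answer:

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SequenceFileReadDemo {

    public static void main(String[] args) throws IOException {
        // Same (assumed) path as the writer demo above.
        String uri = "hdfs://localhost:9900/out1.seq";

        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        Path path = new Path(uri);

        SequenceFile.Reader reader = null;
        try {
            reader = new SequenceFile.Reader(fs, path, conf);
            IntWritable key = new IntWritable();
            Text value = new Text();
            // next() returns false at end of file instead of throwing,
            // provided the writer was closed properly.
            while (reader.next(key, value)) {
                System.out.printf("%s\t%s\n", key, value);
            }
        } finally {
            IOUtils.closeStream(reader);
        }
    }
}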

For more detail on writing sequence files, see the book Hadoop: The Definitive Guide (O'Reilly).

Regarding hadoop - writing to a file with the SequenceFile class, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/15541678/
