
hadoop - New entries not available when creating HFiles programmatically and loading them into HBase


I'm trying to create HFiles programmatically and load them into a running HBase instance. I found a lot of information in HFileOutputFormat and LoadIncrementalHFiles.
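For context, the usual HFileOutputFormat route is a MapReduce job whose output side is wired up by HFileOutputFormat.configureIncrementalLoad. A minimal sketch against the old 0.90-era API this post uses; MyMapper and the paths here are placeholders of mine, not anything from the post:

// Hypothetical job setup for the HFileOutputFormat route (0.90-era API).
// MyMapper is a placeholder and must emit (ImmutableBytesWritable rowKey, KeyValue cell) pairs.
Configuration conf = HBaseConfiguration.create();
HTable table = new HTable(conf, "table");
Job job = new Job(conf, "prepare-hfiles");
job.setJarByClass(MyMapper.class);
job.setMapperClass(MyMapper.class);
job.setMapOutputKeyClass(ImmutableBytesWritable.class);
job.setMapOutputValueClass(KeyValue.class);
FileInputFormat.addInputPath(job, new Path("/input/ga-hourly.txt"));
FileOutputFormat.setOutputPath(job, new Path("/tmp/hfiles"));
// Wires in the reducer, partitioner and output format so the job emits
// HFiles sorted and split to match the table's current regions.
HFileOutputFormat.configureIncrementalLoad(job, table);
job.waitForCompletion(true);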

I managed to create a new HFile and ship it to the cluster. The new store file shows up in the cluster web UI, but the new key range is not available.
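For reference, LoadIncrementalHFiles expects the directory handed to doBulkLoad to contain one subdirectory per column family, each holding the HFiles for that family. The code below follows that layout:

/tmp/hfiles            <- path passed to doBulkLoad
    data/              <- one subdirectory per column family
        hfile          <- the HFile written for that family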

// Read the sample data and build a map of row key -> raw CSV line.
InputStream stream = ProgrammaticHFileGeneration.class.getResourceAsStream("ga-hourly.txt");
BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
String line = null;

Map<byte[], String> rowValues = new HashMap<byte[], String>();

while ((line = reader.readLine()) != null) {
    String[] vals = line.split(",");
    // Row key: the first four CSV fields joined with dots.
    String row = new StringBuilder(vals[0]).append(".").append(vals[1]).append(".").append(vals[2]).append(".").append(vals[3]).toString();
    rowValues.put(row.getBytes(), line);
}

// HFile entries must be appended in sorted key order.
List<byte[]> keys = new ArrayList<byte[]>(rowValues.keySet());
Collections.sort(keys, byteArrComparator);


HBaseTestingUtility testingUtility = new HBaseTestingUtility();
testingUtility.startMiniCluster();

testingUtility.createTable("table".getBytes(), "data".getBytes());

// Write the HFile under a directory named after the column family ("data"),
// matching the layout LoadIncrementalHFiles expects.
Writer writer = new HFile.Writer(testingUtility.getTestFileSystem(),
        new Path("/tmp/hfiles/data/hfile"),
        HFile.DEFAULT_BLOCKSIZE, Compression.Algorithm.NONE, KeyValue.KEY_COMPARATOR);

for (byte[] key : keys) {
    writer.append(new KeyValue(key, "data".getBytes(), "d".getBytes(), rowValues.get(key).getBytes()));
}

// Store-file metadata: bulk-load timestamp and major-compaction marker.
writer.appendFileInfo(StoreFile.BULKLOAD_TIME_KEY, Bytes.toBytes(System.currentTimeMillis()));
writer.appendFileInfo(StoreFile.MAJOR_COMPACTION_KEY, Bytes.toBytes(true));
writer.close();

Configuration conf = testingUtility.getConfiguration();

// Hand the generated files over to the running table.
LoadIncrementalHFiles loadTool = new LoadIncrementalHFiles(conf);
HTable hTable = new HTable(conf, "table".getBytes());

loadTool.doBulkLoad(new Path("/tmp/hfiles"), hTable);

// Scan the whole "data" family to verify the loaded rows are visible.
ResultScanner scanner = hTable.getScanner("data".getBytes());
Result next = null;
System.out.println("Scanning");
while ((next = scanner.next()) != null) {
    System.out.format("%s %s\n", new String(next.getRow()), new String(next.getValue("data".getBytes(), "d".getBytes())));
}
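If the scan prints nothing, a point lookup with Get on one known row helps separate "nothing was loaded" from a scanner problem. A minimal sketch; the row key is a placeholder for whatever one line of ga-hourly.txt produces:

// Sanity check with a direct Get (the row key below is a placeholder).
Get get = new Get("val0.val1.val2.val3".getBytes());
Result result = hTable.get(get);
System.out.println(result.isEmpty()
        ? "row missing: the bulk load did not take effect"
        : "row found: " + new String(result.getValue("data".getBytes(), "d".getBytes())));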

Has anyone actually gotten this to work? I have a compilable/testable version on my github.

Best Answer

We found a similar question with answers on Stack Overflow: https://stackoverflow.com/questions/9665877/
