
hadoop - hive-hbase integration error

Reposted | Author: 可可西里 | Updated: 2023-11-01 16:14:28

I am using Hive 0.12.0, Hadoop 2.4.0, and HBase 0.98.3.

I created a table named info in HBase, populated it with one row of data, and gave Hive access to it (using an external table).
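The question does not show the DDL, but the setup described above would typically look something like the following sketch (the column family `cf` and column `name` are assumptions, not from the original):

```sql
-- Hypothetical reconstruction of the external table mapping; the HBase
-- table 'info' is assumed to have a column family 'cf' with column 'name'.
CREATE EXTERNAL TABLE info (key STRING, name STRING)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:name')
TBLPROPERTIES ('hbase.table.name' = 'info');
```

The `hbase.columns.mapping` property pairs each Hive column with an HBase column (`:key` maps to the row key), so the handler knows how to translate scans into Hive rows.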

When I run the query

  select count(*) from info;

I get:

Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapred.reduce.tasks=<number>
Starting Job = job_1405407486295_0003, Tracking URL = http://prashasti-Vostro-2520:8088/proxy/application_1405407486295_0003/
Kill Command = /home/prashasti/Installed/hadoop/bin/hadoop job -kill job_1405407486295_0003
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
2014-07-15 12:55:14,643 Stage-1 map = 0%, reduce = 0%
2014-07-15 12:55:39,914 Stage-1 map = 100%, reduce = 100%
Ended Job = job_1405407486295_0003 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1405407486295_0003_m_000000 (and more) from job job_1405407486295_0003

Task with the most failures(4):
-----
Task ID:
task_1405407486295_0003_m_000000

URL:
http://0.0.0.0:8088/taskdetails.jsp?jobid=job_1405407486295_0003&tipid=task_1405407486295_0003_m_000000
-----
Diagnostic Messages for this Task:
Error: java.lang.NullPointerException
at org.apache.hadoop.hbase.mapreduce.TableRecordReaderImpl.close(TableRecordReaderImpl.java:161)
at org.apache.hadoop.hbase.mapreduce.TableRecordReader.close(TableRecordReader.java:80)
at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat$1.close(HiveHBaseTableInputFormat.java:198)
at org.apache.hadoop.hive.ql.io.HiveRecordReader.doClose(HiveRecordReader.java:50)
at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.close(HiveContextAwareRecordReader.java:96)
at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.close(MapTask.java:209)
at org.apache.hadoop.mapred.MapTask.closeQuietly(MapTask.java:1950)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:445)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
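Reading the trace, the NullPointerException is thrown from close() on a record reader whose underlying scanner was apparently never initialized, a common symptom of mixing binary-incompatible Hive and HBase versions. A minimal sketch of the defensive-close pattern that avoids this failure mode (class names here are hypothetical, not the actual Hive/HBase sources):

```java
// Sketch of the null-guard that prevents an NPE like the one in
// TableRecordReaderImpl.close(): if initialization failed, the scanner
// field is still null, so close() must check before dereferencing it.
class RecordReaderSketch {
    private AutoCloseable scanner; // remains null if initialize() never ran

    void close() throws Exception {
        if (scanner != null) { // the guard whose absence triggers the NPE
            scanner.close();
        }
    }
}

public class Demo {
    public static void main(String[] args) throws Exception {
        RecordReaderSketch reader = new RecordReaderSketch();
        reader.close(); // safe even though the scanner was never created
        System.out.println("closed safely");
    }
}
```

This illustrates only the crash mechanism; the real remedy, as the answer below notes, is tracked upstream rather than patched by hand.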

Here is what I have already tried: 1) I created a directory hive/auxlib and added protobuf-.jar, zookeeper-.jar, hbase-client-.jar, hbase-server-.jar, hbase-common-.jar, hbase-protobuf-.jar, hive-hbase-handler-.jar, and guava-.jar from hbase/lib.
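An equivalent way to make those jars visible to Hive without copying them into hive/auxlib is to point the CLI at the HBase lib directory directly; a sketch, assuming the installation paths from the log above:

```shell
# Alternative to hive/auxlib: hand the HBase jars to Hive explicitly.
# The path below is taken from the kill command in the log; adjust as needed.
export HIVE_AUX_JARS_PATH=/home/prashasti/Installed/hbase/lib
hive --auxpath "$HIVE_AUX_JARS_PATH" -e 'select count(*) from info;'
```

Either approach only addresses missing classes on the classpath; it does not resolve a version incompatibility between the Hive HBase handler and the HBase client libraries.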

Best Answer

This appears to be a known issue: https://issues.apache.org/jira/browse/HIVE-4520

Regarding this hadoop - hive-hbase integration error, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/24752423/
