
exception - Hive join problem

Reposted · Author: 行者123 · Updated: 2023-12-02 21:56:35

I have a Hive installation on 5 machines (Hive-0.8, Hadoop-1.0.3), and whenever I try to join two tables I get the following exception:

java.lang.RuntimeException: Error while reading from task log url
    at org.apache.hadoop.hive.ql.exec.errors.TaskLogProcessor.getErrors(TaskLogProcessor.java:130)
    at org.apache.hadoop.hive.ql.exec.JobDebugger.showJobFailDebugInfo(JobDebugger.java:211)
    at org.apache.hadoop.hive.ql.exec.JobDebugger.run(JobDebugger.java:81)
    at java.lang.Thread.run(Unknown Source)
Caused by: java.io.IOException: Server returned HTTP response code: 400 for URL: http://hadoop4:50060/tasklog?taskid=attempt_201210161122_0013_r_000001_4&start=-8193



The URL may point to any of the other machines as well.

From searching around on Google, I found that if I change taskid to attemptid in that URL, I can see the actual problem, which is this exception:

FATAL ExecReducer: java.lang.IllegalArgumentException: nanos > 999999999 or < 0
    at java.sql.Timestamp.setNanos(Unknown Source)
    at org.apache.hadoop.hive.serde2.io.TimestampWritable.populateTimestamp(TimestampWritable.java:348)
    at org.apache.hadoop.hive.serde2.io.TimestampWritable.toString(TimestampWritable.java:320)
    at org.apache.hadoop.hive.serde2.lazy.LazyTimestamp.writeUTF8(LazyTimestamp.java:95)
    at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitiveUTF8(LazyUtils.java:232)
    at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:427)
    at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serializeField(LazySimpleSerDe.java:381)
    at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:365)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.processOp(FileSinkOperator.java:569)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.processOp(SelectOperator.java:84)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
    at org.apache.hadoop.hive.ql.exec.FilterOperator.processOp(FilterOperator.java:132)
    at org.apache.hadoop.hive.ql.exec.Operator.process(Operator.java:471)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:762)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.genAllOneUniqueJoinObject(CommonJoinOperator.java:749)
    at org.apache.hadoop.hive.ql.exec.CommonJoinOperator.checkAndGenObject(CommonJoinOperator.java:836)
    at org.apache.hadoop.hive.ql.exec.JoinOperator.endGroup(JoinOperator.java:263)
    at org.apache.hadoop.hive.ql.exec.ExecReducer.reduce(ExecReducer.java:198)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:519)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:420)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Unknown Source)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
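(For reference, the rewrite described above just changes the query parameter name in the tasklog URL. Using the host and attempt ID from the first stack trace, it looks like this:

original (returns HTTP 400):
http://hadoop4:50060/tasklog?taskid=attempt_201210161122_0013_r_000001_4&start=-8193

modified (shows the actual task log):
http://hadoop4:50060/tasklog?attemptid=attempt_201210161122_0013_r_000001_4&start=-8193
)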



I'm not sure whether the whole stack trace is useful, but the only thing Googling this exception turned up was a link to an IBM update/fix for IBM DB2. I don't know whether that is related to any Hive issue, and even if it is, I don't know what to do about it.

Can anyone point me in the right direction?

PS: I have already tried the solutions suggested online for older versions of Hive/Hadoop, but none of them worked. I have also checked for NULL values.

Best Answer

You should provide more information about your Hive environment, for example your table schema and the raw data.
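For example, something along these lines would capture that information (the table name my_table is just a placeholder):

-- show column types and SerDe/storage details of one of the joined tables
DESCRIBE EXTENDED my_table;

-- show a few raw rows, including the timestamp column
SELECT * FROM my_table LIMIT 5;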

From the exception you posted, the problem is most likely that the data you stored in HDFS does not match the Hive TIMESTAMP column it is mapped to.
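One way to check this is to load the suspect column as STRING first and look for values that do not match the layout Hive expects for TIMESTAMP (yyyy-MM-dd HH:mm:ss[.fffffffff]). This is only a sketch; the table and column names (my_table_raw, event_ts) are hypothetical:

-- hypothetical staging table with the timestamp column kept as STRING
CREATE TABLE my_table_raw (id INT, event_ts STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

-- rows whose timestamp text does not match the expected layout; values like
-- these are what make TimestampWritable fail with "nanos > 999999999 or < 0"
SELECT id, event_ts
FROM my_table_raw
WHERE event_ts IS NOT NULL
  AND NOT (event_ts RLIKE '^\\d{4}-\\d{2}-\\d{2} \\d{2}:\\d{2}:\\d{2}(\\.\\d{1,9})?$');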

Regarding "exception - Hive join problem", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/12977502/
