
hadoop - Runtime exception when inserting into a Hive ORC partitioned table


When I try to insert data into a Hive ORC partitioned table, I get the runtime exception below. The query is nothing more than a simple INSERT selecting from a text-format Hive partitioned table.

at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:565)
at org.apache.hadoop.hive.ql.exec.tez.MapRecordSource.processRow(MapRecordSource.java:83)
... 17 more
Caused by: java.lang.NullPointerException
at java.lang.System.arraycopy(Native Method)
at org.apache.hadoop.hive.ql.io.orc.DynamicByteArray.add(DynamicByteArray.java:115)
at org.apache.hadoop.hive.ql.io.orc.StringRedBlackTree.addNewKey(StringRedBlackTree.java:48)
at org.apache.hadoop.hive.ql.io.orc.StringRedBlackTree.add(StringRedBlackTree.java:55)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl$StringTreeWriter.write(WriterImpl.java:1218)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl$StructTreeWriter.write(WriterImpl.java:1743)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl.addRow(WriterImpl.java:2412)
at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.write(OrcOutputFormat.java:86)
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:764)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:841)
at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:88)
at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:841)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:133)
at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:170)
at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:555)
... 18 more ]]
Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:8,
Vertex vertex_1540158411191_10651_2_00 [Map 1] killed/failed due to:OWN_TASK_FAILURE]
DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0 (state=08S01,code=2)
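
For reference, the setup described above can be sketched roughly as follows. All table and column names (src_txt, dst_orc, id, name, dt) are hypothetical, since the question only states that a text-format partitioned table is being inserted into an ORC partitioned table:

```sql
-- Hypothetical schema illustrating the scenario; names are not from the question.
CREATE TABLE src_txt (id BIGINT, name STRING)
PARTITIONED BY (dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;

CREATE TABLE dst_orc (id BIGINT, name STRING)
PARTITIONED BY (dt STRING)
STORED AS ORC;

-- A simple insert of the kind the question describes, assuming
-- dynamic partitioning (the answer's settings below suggest this).
INSERT INTO TABLE dst_orc PARTITION (dt)
SELECT id, name, dt FROM src_txt;
```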

Thanks in advance!

Best Answer

Set all of these properties, then retry the insert (a complete example session is sketched below).

**set hive.exec.dynamic.partition=true;**
**set hive.exec.dynamic.partition.mode=strict;**
**set hive.exec.max.dynamic.partitions.pernode=100;**
**set hive.exec.max.dynamic.partitions=1000;**
**set hive.exec.max.created.files=100000;**
**set hive.error.on.empty.partition=false;**
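
Putting the answer together, a session might look like the sketch below, reusing the hypothetical dst_orc/src_txt names from above. Note that with hive.exec.dynamic.partition.mode=strict, at least one partition column must be given a static value, so the single dt partition is pinned here; setting nonstrict instead would allow the fully dynamic PARTITION (dt) form:

```sql
-- Apply the suggested settings, then retry the insert.
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=strict;
SET hive.exec.max.dynamic.partitions.pernode=100;
SET hive.exec.max.dynamic.partitions=1000;
SET hive.exec.max.created.files=100000;
SET hive.error.on.empty.partition=false;

-- In strict mode the (hypothetical) dt partition must be static; with
-- nonstrict, INSERT ... PARTITION (dt) SELECT id, name, dt ... also works.
INSERT INTO TABLE dst_orc PARTITION (dt='2018-10-21')
SELECT id, name FROM src_txt WHERE dt='2018-10-21';
```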

Regarding "hadoop - Runtime exception when inserting into a Hive ORC partitioned table", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/53165947/
