
java - Failed to create Spark client: Hive on Spark exception

Reposted. Author: 可可西里. Updated: 2023-11-01 15:29:21

I have changed the Hive execution engine to Spark. I now hit an exception whenever I run any DML/DDL statement.

hive> select count(*) from tablename;
Query ID = jibi_john_20160602153012_6ec1da36-dcb3-4f2f-a855-3b68be118b36
Total jobs = 1
Launching Job 1 out of 1
In order to change the average load for a reducer (in bytes):
set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
set mapreduce.job.reduces=<number>
Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask

Best answer

One possible cause is that the client hits its timeout before YARN has allocated the ApplicationMaster for the Spark job. You can extend this timeout by setting hive.spark.client.server.connect.timeout.

The default value is 90000 ms (90 seconds).
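As a sketch, you could raise the timeout either per session in the Hive CLI or permanently in hive-site.xml. The value 300000 ms (5 minutes) below is an illustrative choice, not a recommended setting; pick a value suited to how long your cluster typically takes to allocate the ApplicationMaster.

Per session:

hive> set hive.spark.client.server.connect.timeout=300000;

Or in hive-site.xml:

<property>
  <name>hive.spark.client.server.connect.timeout</name>
  <value>300000</value>
</property>

Note that timeouts of this kind often mask an underlying resource problem; if YARN simply lacks the memory or vcores to start the ApplicationMaster, increasing the timeout only delays the failure.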

Regarding "java - Failed to create Spark client: Hive on Spark exception", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/37589062/
