
apache-spark - Hive on Spark not working - Failed to create Spark client


The following error occurs when running a Hive query with Spark as the execution engine.

Error:
Failed to execute spark task, with exception org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.spark.SparkTask

Hive Console:
hive> set hive.execution.engine=spark;
hive> set spark.master=spark://INBBRDSSVM15.example.com:7077;
hive> set spark.executor.memory=2g;

Hadoop - 2.7.0
Hive - 1.2.1
Spark - 1.6.1

Best Answer

The YARN container memory was smaller than the Spark executor requirement. I set the YARN container memory and maximum allocation to be greater than Spark executor memory + overhead. Check 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.
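A minimal yarn-site.xml sketch of the fix described above. The 3072 MB figure is an assumption, not from the answer: the question requests a 2048 MB executor, and Spark 1.6's default executor memory overhead is max(384 MB, 10% of executor memory) = 384 MB, so each container must be able to hold at least 2432 MB. Adjust the values to your cluster's actual node capacity:

```xml
<!-- yarn-site.xml (on the ResourceManager and NodeManagers; restart YARN after editing) -->
<configuration>
  <!-- Largest single container YARN will grant.
       Must be >= spark.executor.memory (2048 MB) + overhead (384 MB) = 2432 MB. -->
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>3072</value>
  </property>
  <!-- Total memory each NodeManager offers to containers.
       Must also be >= the per-container requirement above. -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>3072</value>
  </property>
</configuration>
```

If either value is below the executor's total requirement, YARN silently never grants the container, and Hive surfaces that timeout as the generic "Failed to create spark client" error seen above.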


Regarding "apache-spark - Hive on Spark not working - Failed to create Spark client", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/38694865/
