
apache-spark - Hive on Spark: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job

Reposted · Author: 行者123 · Updated: 2023-12-02 03:21:12

When I run a query from the Hive console in debug mode, I get the error shown below. I am using hive-1.2.1 and spark-1.5.1; I checked the hive-exec jar and it does contain the class definition org/apache/hive/spark/client/Job.

Caused by: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:792)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:99)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:96)
at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:327)
... 15 more

Eventually the query fails with:

ERROR spark.SparkTask: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'

How can I fix this?

Best Answer

In the hive-1.2.1 pom.xml, spark.version is set to 1.3.1, which explains the class-loading mismatch against Spark 1.5.1.

So the simplest fix is to download spark-1.3.1-bin-hadoop from spark.apache.org.
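To confirm which Spark version a given Hive release was built against, you can read the spark.version property out of its pom.xml. The sketch below simulates the relevant fragment of hive-1.2.1's pom in a temp file so it is self-contained; against a real Hive source tree you would run the sed line on the actual pom.xml instead.

```shell
# Simulated fragment of hive-1.2.1's pom.xml (illustrative only).
cat > /tmp/hive-pom-fragment.xml <<'EOF'
<properties>
  <spark.version>1.3.1</spark.version>
</properties>
EOF

# Extract the version string between the <spark.version> tags.
sed -n 's/.*<spark.version>\(.*\)<\/spark.version>.*/\1/p' /tmp/hive-pom-fragment.xml
```

The printed version is the Spark distribution you should download and point Hive at.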

Then add its path to hive-site.xml, for example:

<property>
  <name>spark.home</name>
  <value>/path/spark-1.3.1-bin-hadoop2.4</value>
</property>
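As an alternative to the hive-site.xml property, Hive on Spark can also locate the Spark installation through the SPARK_HOME environment variable, set before launching the Hive CLI. A minimal sketch, assuming the same illustrative install path as above:

```shell
# Point Hive at the unpacked Spark 1.3.1 distribution (path is illustrative).
export SPARK_HOME=/path/spark-1.3.1-bin-hadoop2.4
echo "$SPARK_HOME"
```

Either mechanism works; the key point is that the referenced distribution must match the Spark version Hive was built against.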

Regarding apache-spark - Hive on Spark: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job, a similar question was found on Stack Overflow: https://stackoverflow.com/questions/33233431/
