
java - Spark HiveContext hql issue in Java when running a Spark job on YARN

Reposted · Author: 搜寻专家 · Updated: 2023-11-01 03:21:53

We have a Spark application that runs on YARN via spark-submit. When we run the following from Java:

sparkHiveContext.hql("show databases")

we get the following exception:

ClassLoaderResolver for class "" gave error on creation : {1} org.datanucleus.exceptions.NucleusUserException: ClassLoaderResolver for class "" gave error on creation : {1}
at org.datanucleus.NucleusContext.getClassLoaderResolver(NucleusContext.java:1087)
at org.datanucleus.PersistenceConfiguration.validatePropertyValue(PersistenceConfiguration.java:797)
at org.datanucleus.PersistenceConfiguration.setProperty(PersistenceConfiguration.java:714)
at org.datanucleus.PersistenceConfiguration.setPersistenceProperties(PersistenceConfiguration.java:693)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:273)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:247)
at org.datanucleus.NucleusContext.<init>(NucleusContext.java:225)

followed by this stack trace:

Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 27 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:408)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 32 more
Caused by: javax.jdo.JDOFatalInternalException: Unexpected exception caught.

However, running the same query from the spark-sql console works fine. What is going wrong here?

Best answer

The reason is that if you package your Spark application as a fat/uber JAR, the DataNucleus libraries break: each DataNucleus dependency JAR carries a resource at the same path (its plugin.xml/pom.xml), so when the JARs are merged into one, a single copy overwrites the others and DataNucleus can no longer resolve its plugins. You need to keep the DataNucleus JARs out of the uber JAR and add them to the classpath separately.
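One way to do this (a sketch, not taken from the original answer: the class name, paths, and JAR versions below are assumptions; use the DataNucleus JARs that ship with your Spark 1.x distribution, typically under $SPARK_HOME/lib) is to mark the DataNucleus artifacts as "provided" in your build so they stay out of the shaded JAR, and ship them separately with spark-submit --jars:

```shell
# Pass the DataNucleus JARs as separate classpath entries instead of
# shading them into the uber JAR; --files distributes hive-site.xml so
# the application can reach the Hive metastore on YARN.
spark-submit \
  --master yarn-cluster \
  --class com.example.MyApp \
  --jars /opt/spark/lib/datanucleus-api-jdo-3.2.6.jar,\
/opt/spark/lib/datanucleus-core-3.2.10.jar,\
/opt/spark/lib/datanucleus-rdbms-3.2.9.jar \
  --files /etc/hive/conf/hive-site.xml \
  my-app-uber.jar
```

This also explains why the spark-sql console works: it is launched with these JARs already on its classpath as individual files, so nothing gets merged or overwritten.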

A similar question about this Spark HiveContext hql issue when running a Spark job on YARN can be found on Stack Overflow: https://stackoverflow.com/questions/28326311/
