
java - Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/tracing/SpanReceiverHost

Reposted. Author: 可可西里. Updated: 2023-11-01 15:26:13

I am running Hadoop 2.8.1 and Hive 2.3.0, and I am trying to read values from a table created in Hive. The current exception is:

java.lang.ClassNotFoundException: org.apache.hadoop.tracing.SpanReceiverHost
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

Here is the code I am using to read the table:

public static final String HIVEURL = "jdbc:hive2://localhost:10000";
public static final String DB_NAME = "default";
public static final String TABLE_NAME = "order_line";

public static void main(String[] args) throws Exception {
    HiveConf hiveConf = new HiveConf();
    //hiveConf.setVar(HiveConf.ConfVars.METASTOREURIS, HIVEURL);
    HiveMetaStoreClient hiveClient = new HiveMetaStoreClient(hiveConf);

    Job job = Job.getInstance();
    TaskAttemptContext ctx = new TaskAttemptContextImpl(job.getConfiguration(), new TaskAttemptID());
    HCatInputFormat hcif = HCatInputFormat.setInput(job, DB_NAME, TABLE_NAME);

    HCatSchema allCols = hcif.getTableSchema(job.getConfiguration());
    List<HCatFieldSchema> usedList = new ArrayList<>();
    usedList.add(allCols.get(2)); // for example...
    HCatSchema someCols = new HCatSchema(usedList);
    hcif.setOutputSchema(job, someCols);

    for (InputSplit split : hcif.getSplits(job)) {
        RecordReader<WritableComparable, HCatRecord> rr = hcif.createRecordReader(split, ctx);
        rr.initialize(split, ctx);

        while (rr.nextKeyValue()) {
            HCatRecord record = rr.getCurrentValue();
            // use record.get(...) to fetch the column...
            //Object o = record.get(1);
            //System.out.println(o.toString());
        }

        rr.close();
    }

    hiveClient.close();
}

And here is the pom file I used:

<dependencies>
    <dependency>
        <groupId>org.apache.hive.hcatalog</groupId>
        <artifactId>hive-hcatalog-core</artifactId>
        <version>2.3.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive.hcatalog</groupId>
        <artifactId>hcatalog</artifactId>
        <version>0.13.1-cdh5.3.5</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-common</artifactId>
        <version>2.3.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive</artifactId>
        <version>0.13.1-cdh5.3.5</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-metastore</artifactId>
        <version>2.3.0</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.8.1</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>2.6.0-mr1-cdh5.12.1</version>
        <type>pom</type>
    </dependency>

    <dependency>
        <groupId>org.apache.thrift</groupId>
        <artifactId>libthrift</artifactId>
        <version>0.9.3</version>
    </dependency>
</dependencies>

Best Answer

I can't tell from that stack-trace snippet what triggered the loadClass, but it looks as though the class genuinely isn't present in the hadoop-common 2.8.1 you are using. It seems to have disappeared after 2.7.2.
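A quick way to confirm whether the class really is absent is to ask the classloader directly at runtime. The following is a minimal, self-contained sketch; the `WhichJar` class name and `describe` helper are mine, not part of the question:

```java
import java.security.CodeSource;

// Hypothetical diagnostic: report which jar (if any) supplies a class
// on the current classpath.
public class WhichJar {

    public static String describe(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Classes loaded from the JDK runtime image have no CodeSource.
            return className + " -> " + (src != null ? src.getLocation() : "JDK runtime");
        } catch (ClassNotFoundException e) {
            return className + " -> not on classpath";
        }
    }

    public static void main(String[] args) {
        // Check the class from the stack trace, plus a known-good JDK class.
        System.out.println(describe("org.apache.hadoop.tracing.SpanReceiverHost"));
        System.out.println(describe("java.lang.String"));
    }
}
```

If the first line prints "not on classpath", the NoClassDefFoundError is expected; if it prints a jar location, the problem is elsewhere (e.g. a static initializer failure).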

It, or something with the same name, appears to be in the hbase source.

Are you mixing versions?
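One way to check for mixed versions is to list every classpath entry that supplies the same class file; more than one hit usually means two jars (say, a CDH artifact and an Apache one) are shadowing each other. A minimal sketch, assuming only the JDK; the `FindDuplicates` name is mine, and `org/apache/hadoop/fs/FileSystem.class` is just an example resource:

```java
import java.io.IOException;
import java.net.URL;
import java.util.Collections;
import java.util.List;

// Hypothetical check for shadowed jars: each URL returned is a distinct
// classpath location that provides the same class file.
public class FindDuplicates {

    public static List<URL> sources(String resource) {
        try {
            return Collections.list(
                    FindDuplicates.class.getClassLoader().getResources(resource));
        } catch (IOException e) {
            return Collections.emptyList();
        }
    }

    public static void main(String[] args) {
        // Example: which jar(s) provide hadoop-common's FileSystem?
        for (URL u : sources("org/apache/hadoop/fs/FileSystem.class")) {
            System.out.println(u);
        }
    }
}
```

With the pom above you would want exactly one location per Hadoop class; two or more means the CDH and Apache artifacts need to be reconciled to a single version line.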

For java - Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/tracing/SpanReceiverHost, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/46664843/
