
scala - Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/catalyst/analysis/OverrideFunctionRegistry


I have tried the following code in Spark and Scala; the code and pom.xml are attached.

package com.Spark.ConnectToHadoop

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object CountWords {

  def main(args: Array[String]): Unit = {

    // Connect to the standalone Spark master
    val objConf = new SparkConf()
      .setAppName("Spark Connection")
      .setMaster("spark://IP:7077")
    val sc = new SparkContext(objConf)
    val objHiveContext = new HiveContext(sc)
    objHiveContext.sql("USE test")

    // List the tables in the current database and print each row
    val test = objHiveContext.sql("show tables")
    val testing = test.collect()
    for (row <- testing) {
      println(row)
    }
  }
}

I have already added the spark-core_2.10, spark-catalyst_2.10, spark-sql_2.10, and spark-hive_2.10 dependencies. Do I need to add any other dependency?

Edit:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.Sudhir.Maven1</groupId>
    <artifactId>SparkDemo</artifactId>
    <version>IntervalMeterData1</version>
    <packaging>jar</packaging>

    <name>SparkDemo</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spark.version>1.5.2</spark.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-catalyst_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
</project>

Best Answer

It looks like you forgot to bump spark-hive:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.5.2</version>
    </dependency>

Consider introducing a Maven property such as spark.version:

    <properties>
        <spark.version>1.5.2</spark.version>
    </properties>

and declaring all of the Spark dependencies this way:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>

Upgrading the Spark version later will then be much less painful.

Just adding the spark.version property to your <properties> is not enough; you must also reference it as ${spark.version} in the dependency declarations.
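Putting the answer together, the Spark section of the pom from the question, with all four Spark artifacts aligned on a single version through the property, might look like this (a sketch based on the pom above; only the Spark dependencies are shown):

```xml
<properties>
    <spark.version>1.5.2</spark.version>
</properties>

<dependencies>
    <!-- Every Spark module resolves to the same version via ${spark.version},
         so spark-hive can no longer drift out of sync with spark-catalyst. -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-catalyst_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
```

With spark-hive at 1.2.1 but spark-catalyst at 1.5.2, spark-hive was compiled against classes (such as OverrideFunctionRegistry) that moved between releases, which is exactly what surfaces at runtime as a NoClassDefFoundError.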

Regarding "scala - Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/catalyst/analysis/OverrideFunctionRegistry", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/34871015/
