
java - Spark submit fails with java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;


I am using the Spark 1.3.1 prebuilt distribution spark-1.3.1-bin-hadoop2.6.tgz and hit the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
    at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1418)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:52)
    at com.zoho.zbi.Testing.test(Testing.java:43)
    at com.zoho.zbi.Testing.main(Testing.java:39)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

I am trying a simple demo application that saves data to Cassandra:

// Requires the static imports from the Java connector:
// import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
// import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

SparkConf batchConf = new SparkConf()
        .setSparkHome(sparkHome)
        .setJars(jars)
        .setAppName(ZohoBIConstants.getAppName("cassandra"))//NO I18N
        .setMaster(master)
        .set("spark.cassandra.connection.host", "localhost");

JavaSparkContext sc = new JavaSparkContext(batchConf);
// here we are going to save some data to Cassandra...
List<Person> people = Arrays.asList(
        Person.newInstance(1, "John", new Date()),
        Person.newInstance(2, "Anna", new Date()),
        Person.newInstance(3, "Andrew", new Date())
);
// Person test = Person.newInstance(1, "vini", new Date());
System.out.println("Inside Java API Demo : " + people);
JavaRDD<Person> rdd = sc.parallelize(people);
System.out.println("Inside Java API Demo rdd : " + rdd);
javaFunctions(rdd).writerBuilder("test", "people", mapToRow(Person.class)).saveToCassandra();
System.out.println("Stopping sc");
sc.stop();
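
For reference, the snippet assumes a Person bean with a static factory and JavaBean accessors. The original class is not shown, so the following is only a hypothetical sketch; field names are assumed to match the columns of the test.people table:

import java.io.Serializable;
import java.util.Date;

// Hypothetical stand-in for the Person class used above.
// mapToRow(Person.class) maps JavaBean properties to column names,
// so the getters/setters must line up with the test.people schema.
public class Person implements Serializable {
    private Integer id;
    private String name;
    private Date birthDate;

    public static Person newInstance(Integer id, String name, Date birthDate) {
        Person p = new Person();
        p.id = id;
        p.name = name;
        p.birthDate = birthDate;
        return p;
    }

    public Integer getId() { return id; }
    public void setId(Integer id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public Date getBirthDate() { return birthDate; }
    public void setBirthDate(Date birthDate) { this.birthDate = birthDate; }
}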

It works when I submit it with spark-submit:

bin/spark-submit --class "abc.efg.Testing" --master spark://xyz:7077 /home/test/target/uber-Cassandra-0.0.1-SNAPSHOT.jar

Here is my pom.

Dependencies:

<dependencies>
  <!-- Scala -->
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
    <scope>compile</scope>
  </dependency>

  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-compiler</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <!-- END Scala -->

  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>18.0</version>
  </dependency>

  <dependency>
    <groupId>com.yammer.metrics</groupId>
    <artifactId>metrics-core</artifactId>
    <version>2.2.0</version>
  </dependency>

  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>

  <dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.1.0</version>
    <scope>provided</scope>
  </dependency>

  <dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>2.1.5</version>
  </dependency>

  <dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20090211</version>
  </dependency>

  <!-- Cassandra Spark Connector dependency -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.2.0</version>
  </dependency>

  <!-- Cassandra Java Connector dependency -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.2.0</version>
  </dependency>

  <!-- Spark Core dependency -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.3.1</version>
  </dependency>

  <!-- Spark Streaming dependency -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>1.3.1</version>
  </dependency>

  <!-- Spark Streaming Kafka dependency -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.3.1</version>
  </dependency>
</dependencies>

And I build with:

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>2.3.2</version>
      <configuration>
        <source>1.7</source>
        <target>1.7</target>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>2.3</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
            </excludes>
          </filter>
        </filters>
        <finalName>uber-${project.artifactId}-${project.version}</finalName>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>2.3.2</version>
      <configuration>
        <source>1.7</source>
        <target>1.7</target>
      </configuration>
    </plugin>
  </plugins>
</build>

But when I submit it through code it does not work; any help is much appreciated. I tried adding a scala 2.10.4 property to the pom, still with no luck.

I run it as an application in Eclipse, with master, spark home, and jars all set on the SparkConf, and the error points exactly at the SparkConf.

My Scala version is:

scala -version
Scala code runner version 2.11.2 -- Copyright 2002-2013, LAMP/EPFL

Does this have anything to do with the problem?

How do I switch to an older Scala version? The documentation says Spark 1.3.1 supports Scala 2.10.x; please let me know how to fix this.

Best Answer

The problem you are seeing is caused by incompatible Scala versions. The prebuilt Spark 1.3.1 distribution is compiled with the older Scala 2.10, because some Spark dependencies are not supported under 2.11, including JDBC support. Your pom also mixes binary versions (spark-core_2.11 and spark-streaming_2.11 alongside the _2.10 connector and Kafka artifacts); every Scala artifact must target a single Scala binary version. In Scala 2.11, Predef gained the $conforms method, so code compiled against 2.11 throws exactly this NoSuchMethodError when it runs on a 2.10 scala-library.

I would recommend running your Spark cluster with Scala 2.10. However, if you want, you can also compile your Spark package with Scala 2.11 as follows:

dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
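
To make that concrete, here is a minimal sketch of the aligned Spark dependencies, assuming you stay on Scala 2.10 and keep the versions already in the pom above:

<properties>
  <scala.version>2.10.4</scala.version>
</properties>

<!-- all _2.xx-suffixed artifacts must share the same suffix, here _2.10 -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.3.1</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.3.1</version>
</dependency>

With these in place, the existing spark-cassandra-connector_2.10, spark-cassandra-connector-java_2.10, and spark-streaming-kafka_2.10 entries already match.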

Regarding java - Spark submit fails with java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/30342273/
