I am trying to integrate the Kafka message broker with Spark and am running into this problem: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka010/LocationStrategies
Below is the Java Spark code:
package com.test.spell;

import java.util.Arrays;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.Map;
import java.util.Set;
import java.util.regex.Pattern;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.*;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

import scala.Tuple2;

public class App
{
    private static final Pattern SPACE = Pattern.compile(" ");

    public static void main( String[] args )
    {
        String brokers = "localhost:9092";
        String topics = "spark-topic";

        // Create context with a 2-second batch interval
        SparkConf sparkConf = new SparkConf().setAppName("JavaDirectKafkaWordCount");
        JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, Durations.seconds(2));

        Set<String> topicsSet = new HashSet<>(Arrays.asList(topics.split(",")));
        Map<String, Object> kafkaParams = new HashMap<>();
        kafkaParams.put("metadata.broker.list", brokers);

        // Create direct kafka stream with brokers and topics
        JavaInputDStream<ConsumerRecord<String, String>> messages = KafkaUtils.createDirectStream(
                jssc,
                LocationStrategies.PreferConsistent(),
                ConsumerStrategies.<String, String>Subscribe(topicsSet, kafkaParams));

        System.out.println("In program");

        // Get the lines, split them into words, count the words and print
        JavaDStream<String> lines = messages.map(new Function<ConsumerRecord<String, String>, String>() {
            @Override
            public String call(ConsumerRecord<String, String> kafkaRecord) throws Exception {
                return kafkaRecord.value();
            }
        });

        JavaDStream<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
            @Override
            public Iterator<String> call(String line) throws Exception {
                System.out.println(line);
                return Arrays.asList(line.split(" ")).iterator();
            }
        });

        /* JavaPairDStream<String, Integer> wordCounts = words.mapToPair(new PairFunction<String, String, Integer>() {
            @Override
            public Tuple2<String, Integer> call(String word) throws Exception {
                return new Tuple2<>(word, 1);
            }
        }); */
        // wordCounts.print();

        // Start the computation
        jssc.start();
        jssc.awaitTermination();
    }
}
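The map/flatMap stages above only extract each record's value and split it on spaces; that per-record logic can be sketched in plain Java without Spark or Kafka (the class name `WordSplitDemo` is mine, for illustration only):

```java
import java.util.Arrays;
import java.util.List;

public class WordSplitDemo {
    // Mirrors the flatMap stage: split one line into its words.
    public static List<String> splitWords(String line) {
        return Arrays.asList(line.split(" "));
    }

    public static void main(String[] args) {
        System.out.println(splitWords("kafka spark streaming"));
    }
}
```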
Below is my pom.xml. I have tried many versions of the jar files but could not find a suitable combination.
<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.spark-project.spark</groupId>
    <artifactId>unused</artifactId>
    <version>1.0.0</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.8.2-beta</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>0.9.0-incubating</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_2.10</artifactId>
    <version>2.0.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.3.0</version>
  </dependency>
  <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.3.1</version>
    <scope>provided</scope>
  </dependency>
  <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.0.0</version>
  </dependency>
</dependencies>
</project>
I am running my Spark job as follows:
./bin/spark-submit --class com.test.spell.spark.App \
--master local \
--driver-memory 4g \
--executor-memory 2g \
--executor-cores 1 \
--queue default \
/home/cwadmin/spark_programs/spell/target/spell-0.0.1-SNAPSHOT.jar
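As an alternative to bundling the Kafka connector into the application jar, spark-submit can fetch it at runtime with the `--packages` flag. The coordinates below are an assumption (Spark 2.3.1 built for Scala 2.11); adjust them to match your actual Spark build:

```shell
./bin/spark-submit --class com.test.spell.spark.App \
  --master local \
  --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.3.1 \
  /home/cwadmin/spark_programs/spell/target/spell-0.0.1-SNAPSHOT.jar
```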
I think the problem above is caused by using the wrong jar files. Can someone help me solve this? I would like to know which jar files should be used here. Also, if anyone can share some good resources on these topics (such as integrating Spark and Kafka), it would be much appreciated.
I have been trying to solve this for 2 days but could not.
Thanks in advance.
Best answer
The class is missing at runtime because dependencies marked `provided` are not packaged into your application jar (and jars on the wrong Scala version will not resolve either). Build a fat jar that bundles the `spark-streaming-kafka-0-10` connector; the easiest way is to use the Maven Assembly plugin. Regarding java - Kafka Spark Streaming LocationStrategies NoClassDefFoundError, we found a similar question on Stack Overflow: https://stackoverflow.com/questions/51108376/