
java - Spark Streaming performs no operations on the blocks it reads


I am new to the concept of Spark Streaming and have spent the last two days trying to understand Spark Streaming from a socket. I can see that Spark reads the blocks sent to the socket, but it does not perform any operations on the blocks it reads.

Here is the Spark code:

package foo;

import java.io.File;
import java.util.Arrays;
import java.util.LinkedList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

import scala.Tuple2;

public class AppSocket {

    public static void main(String[] args) {

        SparkConf conf = new SparkConf().setAppName("KAFKA").setMaster("local");

        // 1-second batch interval
        JavaStreamingContext jssc = new JavaStreamingContext(conf, new org.apache.spark.streaming.Duration(1000));

        // Receive lines of text from the socket
        JavaReceiverInputDStream<String> inputStream = jssc.socketTextStream("localhost", 33333);

        // Map each received line to a (line, 1) pair
        JavaPairDStream<String, Integer> mappedStream = inputStream.mapToPair(
                new PairFunction<String, String, Integer>() {
                    public Tuple2<String, Integer> call(String i) {
                        System.out.println(i);
                        return new Tuple2<String, Integer>(i, 1);
                    }
                });

        // Sum the counts for each distinct line
        JavaPairDStream<String, Integer> reducedStream = mappedStream.reduceByKey(
                new Function2<Integer, Integer, Integer>() {
                    public Integer call(Integer i1, Integer i2) {
                        return i1 + i2;
                    }
                });

        reducedStream.print();
        // Note: count() returns another DStream, so this prints a DStream
        // reference rather than an actual number
        System.out.println("Testing........" + reducedStream.count());

        jssc.start();
        jssc.awaitTermination();
    }
}

I am running netcat to produce an output stream on the specified port:

nc -lk 33333

I also tried creating the output stream myself. Here is my Java code:

// The JMS consumer, session, and connection are created earlier in the program
int portNumber = 33333;
ServerSocket serverSocket = new ServerSocket(portNumber);

System.out.println("Server Waiting.................");

Socket clientSocket = serverSocket.accept();

System.out.println("Server Connected!!!!!!!!!!!!!!!");

// Buffer incoming JMS messages and forward them to the client in batches
int countflag = 0;
int count = 0;
List<String> list = new LinkedList<String>();
PrintWriter out = new PrintWriter(clientSocket.getOutputStream(), true);
while (true) {
    Message message = consumer.receive(1000);

    if (message instanceof TextMessage) {
        TextMessage textMessage = (TextMessage) message;
        String text = textMessage.getText();
        System.out.println("Received: " + text);
        list.add(text);
        System.out.println(++countflag);
        if (list.size() > 50) {
            // Write each buffered message as one line on the socket
            for (int i = 0; i < list.size(); i++) {
                System.out.print(i);
                out.write(list.get(i));
                out.write("\n");
                out.flush();
            }
            list.clear();
        }
    } else {
        count++;
    }
    if (count > 100) break;
}
out.close();
consumer.close();
session.close();
connection.close();

Spark consumes the blocks sent on the stream, but it does not perform any of the intended operations on the streamed blocks.

Spark console output:

14/11/26 15:32:14 INFO MemoryStore: ensureFreeSpace(12) called with curMem=3521, maxMem=278302556
14/11/26 15:32:14 INFO MemoryStore: Block input-0-1417015934400 stored as bytes in memory (estimated size 12.0 B, free 265.4 MB)
14/11/26 15:32:14 INFO BlockManagerInfo: Added input-0-1417015934400 in memory on ip-10-0-1-56.ec2.internal:57275 (size: 12.0 B, free: 265.4 MB)
14/11/26 15:32:14 INFO BlockManagerMaster: Updated info of block input-0-1417015934400
14/11/26 15:32:14 WARN BlockManager: Block input-0-1417015934400 already exists on this machine; not re-adding it
14/11/26 15:32:14 INFO BlockGenerator: Pushed block input-0-1417015934400
14/11/26 15:32:15 INFO ReceiverTracker: Stream 0 received 1 blocks
14/11/26 15:32:15 INFO JobScheduler: Added jobs for time 1417015935000 ms

Any help is greatly appreciated. Thanks in advance.

Best Answer

Set the master to "local[n]" with n > 1. A receiver needs a task slot to run in, and "local" provides only a single task slot. The receiver therefore occupies that slot, leaving no task slots free to process the data.
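Concretely, only the master setting in the question's code needs to change; for example (the value 2 is simply the smallest setting that works here, one thread for the receiver and at least one for processing):

SparkConf conf = new SparkConf().setAppName("KAFKA").setMaster("local[2]");

With that change, reducedStream.print() should start printing the (line, count) pairs for each 1-second batch.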

I suggest reading the "Points to remember" section of the programming guide: http://spark.apache.org/docs/latest/streaming-programming-guide.html#input-dstreams

Regarding "java - Spark Streaming performs no operations on the blocks it reads", a similar question was found on Stack Overflow: https://stackoverflow.com/questions/27153247/
