I need to deploy a Spark Streaming application on a Linux server.
Can anyone provide the steps for deploying it, and explain what code modifications are required before deployment? Here is my current code:
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import scala.Tuple2;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.examples.streaming.StreamingExamples;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

public class JavaKafkaWordCount11 {
    public static void main(String[] args) {
        StreamingExamples.setStreamingLogLevels();
        SparkConf sparkConf = new SparkConf()
                .setAppName("JavaKafkaWordCount11")
                .setMaster("local[*]");
        sparkConf.set("spark.streaming.concurrentJobs", "20");
        // Streaming context with a 1.5-second batch interval
        JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(1500));
        // Topic name -> number of receiver threads
        Map<String, Integer> topicMap = new HashMap<>();
        topicMap.put("TopicQueue", 20);
        // Receiver-based Kafka stream (ZooKeeper quorum, consumer group id, topics)
        JavaPairReceiverInputDStream<String, String> messages =
                KafkaUtils.createStream(jssc, "x.xx.xxx.xxx:2181", "1", topicMap);
        // Keep only the message value from each (key, value) pair
        JavaDStream<String> lines = messages.map(new Function<Tuple2<String, String>, String>() {
            @Override
            public String call(Tuple2<String, String> tuple2) {
                return tuple2._2();
            }
        });
        lines.foreachRDD(rdd -> {
            if (rdd.count() > 0) {
                List<String> strArray = rdd.collect();
                getProcessResult(strArray); // my own processing of the collected records
            }
        });
        // Start the streaming computation and block until it terminates
        jssc.start();
        jssc.awaitTermination();
    }
}
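From what I have read, the packaged jar is normally launched on the server with spark-submit, with the master URL passed on the command line instead of being hard-coded in the application. Is the change sketched below roughly what is needed before deploying? This is only my guess, so please correct me if I am wrong:

// My guess at the config change before deploying (not sure if this is right):
// drop the hard-coded setMaster("local[*]") so that the master can be
// supplied at submit time, e.g. via spark-submit's --master option.
SparkConf sparkConf = new SparkConf()
        .setAppName("JavaKafkaWordCount11");
sparkConf.set("spark.streaming.concurrentJobs", "20");

Any pointers on the remaining deployment steps would be appreciated.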