How to integrate Storm and Kafka [closed]

Published 2019-02-15 16:27

I have worked with Storm and developed a basic program that uses a local text file as its input source. Now I have to work with streaming data arriving continuously from external systems, and Kafka seems the best choice for that.

The problem is how to make my Spout consume streaming data from Kafka, i.e. how to integrate Storm with Kafka so that I can process the data coming from it.

1 Answer
女痞 · 2019-02-15 17:04

Look for KafkaSpout.

It's a regular Storm Spout implementation that reads from a Kafka cluster. All you need to do is configure the spout with parameters such as the list of brokers, the topic name, and so on. You can then chain its output to the corresponding bolts for further processing.

From the KafkaSpout documentation, the configuration goes like this:

// Requires the storm-kafka module and Guava (for ImmutableList) on the classpath.
SpoutConfig spoutConfig = new SpoutConfig(
      ImmutableList.of("kafkahost1", "kafkahost2"), // List of Kafka brokers
      8, // Number of partitions per host
      "clicks", // Topic to read from
      "/kafkastorm", // Root path in ZooKeeper where the spout stores consumer offsets
      "discovery"); // An id for this consumer, used when storing offsets in ZooKeeper

KafkaSpout kafkaSpout = new KafkaSpout(spoutConfig);
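To show how the spout's output can be chained to bolts, here is a minimal topology sketch. It assumes the old `backtype.storm` API from the same era as the snippet above; the bolt, component names, and parallelism hints are illustrative, not part of the original answer:

    import backtype.storm.Config;
    import backtype.storm.LocalCluster;
    import backtype.storm.topology.BasicOutputCollector;
    import backtype.storm.topology.OutputFieldsDeclarer;
    import backtype.storm.topology.TopologyBuilder;
    import backtype.storm.topology.base.BaseBasicBolt;
    import backtype.storm.tuple.Tuple;
    import storm.kafka.KafkaSpout;
    import storm.kafka.SpoutConfig;

    public class KafkaTopology {

        // A trivial sink bolt that just logs each Kafka message it receives.
        public static class PrinterBolt extends BaseBasicBolt {
            @Override
            public void execute(Tuple tuple, BasicOutputCollector collector) {
                // KafkaSpout emits the raw message payload as the first field.
                System.out.println(tuple.getString(0));
            }

            @Override
            public void declareOutputFields(OutputFieldsDeclarer declarer) {
                // Sink bolt: emits nothing downstream.
            }
        }

        public static void main(String[] args) {
            // spoutConfig built as shown in the answer above.
            SpoutConfig spoutConfig = /* ... */ null;

            TopologyBuilder builder = new TopologyBuilder();
            // The spout reads from Kafka and feeds the rest of the topology.
            builder.setSpout("kafka-spout", new KafkaSpout(spoutConfig), 1);
            // shuffleGrouping distributes the messages evenly across bolt tasks.
            builder.setBolt("printer", new PrinterBolt(), 2)
                   .shuffleGrouping("kafka-spout");

            Config conf = new Config();
            LocalCluster cluster = new LocalCluster();
            cluster.submitTopology("kafka-topology", conf, builder.createTopology());
        }
    }

The key point is that once the KafkaSpout is registered with `setSpout`, any bolt can subscribe to it through a grouping, exactly as with any other Storm spout.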