Confluent Kafka 5.0.0 is installed on an AWS EC2 instance with a public IP, say 54.XX.XX.XX. Port 9092 is opened on the EC2 machine to 0.0.0.0/0.
In /etc/kafka/server.properties I have
advertised.listeners=PLAINTEXT://54.XX.XX.XX:9092
listeners=PLAINTEXT://0.0.0.0:9092
In /etc/kafka/producer.properties I have
bootstrap.servers=0.0.0.0:9092
On the local machine, in /etc/kafka/consumer.properties I have
bootstrap.servers=54.XX.XX.XX:9092
On the EC2 instance, I started Kafka with 'confluent start' and created the topic 'mytopic'.
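The topic was presumably created with the kafka-topics CLI that ships with Confluent; a sketch, with illustrative partition and replication values:
kafka-topics --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic mytopic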
My producer.py code running from the local machine looks like this (relevant portion):
from confluent_kafka import Producer
import json

broker = '54.XX.XX.XX'
topic = 'mytopic'

def delivery_report(err, msg):
    # Delivery callback invoked by produce(): report per-message success/failure
    if err is not None:
        print('Delivery failed: {}'.format(err))

p = Producer({'bootstrap.servers': broker})
for data in dictList:  # dictList: list of dicts to send (defined elsewhere)
    p.poll(0)  # serve delivery callbacks
    sendme = json.dumps(data)
    p.produce(topic, sendme.encode('utf-8'), callback=delivery_report)
p.flush()
This seems to write messages to 'mytopic' in the Kafka stream on the EC2. I can see those messages with 'kafkacat -b 54.XX.XX.XX -t mytopic' on the EC2.
But I am not able to access those messages from the local machine with a simple message-printing consumer, with code as below:
from confluent_kafka import Consumer, KafkaError, KafkaException
import json
import sys

broker = '54.XX.XX.XX'
topic = 'mytopic'
group = 'mygroup'
running = True

c = Consumer({
    'bootstrap.servers': broker,
    'group.id': group,
    'session.timeout.ms': 6000,
    'default.topic.config': {
        'auto.offset.reset': 'smallest'
    }
})

def basic_consume_loop(consumer, topics):
    try:
        consumer.subscribe(topics)
        while running:
            msg = consumer.poll(timeout=1.0)
            if msg is None:
                continue
            if msg.error():
                if msg.error().code() == KafkaError._PARTITION_EOF:
                    # End of partition event
                    sys.stderr.write('{} [{}] reached end at offset {}\n'.format(
                        msg.topic(), msg.partition(), msg.offset()))
                    data_process()
                else:
                    raise KafkaException(msg.error())
            else:
                msg_process(msg)  # msg_process / data_process are defined elsewhere
    finally:
        # Close down consumer to commit final offsets.
        print("Shutting down the consumer")
        consumer.close()

basic_consume_loop(c, [topic])
It just hangs. Did I miss any settings?
The following steps seem to work.
On both the local and the EC2 machine, in /etc/kafka/server.properties set
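Presumably the same listener settings already used in the question, something like:
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://54.XX.XX.XX:9092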
On the local machine, in /etc/kafka/producer.properties set
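Presumably the broker's public address, so that the local producer reaches the remote broker, e.g.:
bootstrap.servers=54.XX.XX.XX:9092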
On the EC2 machine, in /etc/kafka/producer.properties set
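Presumably the broker address as seen from the EC2 machine itself, e.g.:
bootstrap.servers=54.XX.XX.XX:9092
(or localhost:9092, since the broker runs on that machine)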
On both the local and the EC2 machine, in /etc/kafka/consumer.properties set
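Presumably the public address of the broker on both machines, e.g.:
bootstrap.servers=54.XX.XX.XX:9092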
Use 'confluent start' to start all necessary daemons on the remote EC2 machine. On the local machine, Confluent is NOT started.
On the local machine (for IP hiding, optional):
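Presumably an /etc/hosts entry mapping a hostname to the EC2 public IP, so the raw IP does not have to appear in the client code; the hostname below is only illustrative:
54.XX.XX.XX   ec2kafka
Clients can then use this hostname in bootstrap.servers instead of the raw IP.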
With this, a producer on the local machine can put messages on the remote EC2 Kafka as follows:
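Presumably the same producer code as in the question, with bootstrap.servers pointing at the remote broker; a minimal sketch (the payload here is illustrative):
from confluent_kafka import Producer
import json

p = Producer({'bootstrap.servers': '54.XX.XX.XX:9092'})
p.produce('mytopic', json.dumps({'key': 'value'}).encode('utf-8'))
p.flush()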
From the local machine, messages can be fetched from the remote EC2 Kafka as follows:
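Presumably the consumer code from the question (with running defined before the loop); a minimal sketch that just prints each message:
from confluent_kafka import Consumer

c = Consumer({
    'bootstrap.servers': '54.XX.XX.XX:9092',
    'group.id': 'mygroup',
    'default.topic.config': {'auto.offset.reset': 'smallest'}
})
c.subscribe(['mytopic'])
try:
    while True:
        msg = c.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(msg.error())
            continue
        print(msg.value().decode('utf-8'))
finally:
    c.close()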
These steps seem to work. There could be some redundancies; if so, please do point them out.