PyKafka producer.get_delivery_report throwing Queue.Empty

Published 2019-05-17 16:38

Question:

I am currently working on a Kafka integration using Python. I am new to both Kafka and Python, coming from a PHP background.

I have managed to get the producer working, however it is not processing messages fast enough because it waits for an ack from Kafka after each one.

On the GitHub page (https://github.com/Parsely/pykafka) there is the following example that should process messages asynchronously and still allow for delivery reports:

>>> with topic.get_producer(delivery_reports=True) as producer:
...     count = 0
...     while True:
...         count += 1
...         producer.produce('test msg', partition_key='{}'.format(count))
...         if count % 10**5 == 0:  # adjust this or bring lots of RAM ;)
...             while True:
...                 try:
...                     msg, exc = producer.get_delivery_report(block=False)
...                     if exc is not None:
...                         print 'Failed to deliver msg {}: {}'.format(
...                             msg.partition_key, repr(exc))
...                     else:
...                         print 'Successfully delivered msg {}'.format(
...                             msg.partition_key)
...                 except Queue.Empty:
...                     break

I have modified the example, however from testing I can see that the first message is sent successfully, but a Queue.Empty exception is thrown.

This is my modified code:

from pykafka import KafkaClient
import Queue
import json

client = KafkaClient(hosts='1.1.1.1:9092')
topic = client.topics['test']


sync = False
# sync = True

if sync:
    with topic.get_sync_producer() as producer:
        count = 0
        while True:
            count += 1
            producer.produce('Test message ' + str(count))
            print 'Sent message ' + str(count)
else:
    with topic.get_producer(delivery_reports=True) as producer:
        count = 0
        while True:
            count += 1
            if count > 100:
                print 'Processed 100 messages'
                break
            producer.produce('Test message ' + str(count))
            while True:
                try:
                    msg, exc = producer.get_delivery_report(block=False)
                    if exc is not None:
                        print 'Failed to deliver msg {}: {}'.format(msg.offset, repr(exc))
                    else:
                        print 'Successfully delivered msg {}'.format(msg.offset)
                except Queue.Empty:
                    print 'Queue.empty'
                    break

And the output:

/Users/jim/Projects/kafka_test/env/bin/python /Users/jim/Projects/kafka_test/producer.py
Queue.empty
...
... x100
Processed 100 messages

From checking my consumer I can see that all 100 messages have been sent successfully, but I am unable to tell this from my producer.

Do you have any suggestions on how I can improve this implementation, more specifically how I can increase my throughput while keeping the ability to check the message was successful?

Answer 1:

I found a GitHub issue related to this: https://github.com/Parsely/pykafka/issues/291

I fixed this by lowering min_queued_messages (which defaults to a large batch size, so the producer holds messages back until the batch fills) to 1.

import time

with topic.get_sync_producer(min_queued_messages=1) as producer:
    count = 0
    while True:
        count += 1
        time_start = time.time()
        producer.produce('Test message ' + str(count))
        time_end = time.time()

        print 'Sent message %d, %ss duration' % (count, (time_end - time_start))
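
The sync producer above pays the ack wait on every message. To keep the async producer's throughput while still checking results, the same min_queued_messages=1 setting can be combined with delivery_reports=True, draining the report queue periodically rather than after every produce. Below is a minimal sketch of such a drain loop: drain_reports and the stand-in report queue are hypothetical helpers, not part of pykafka; only get_delivery_report's behavior (returning a (msg, exc) tuple and raising Queue.Empty when block=False finds nothing) comes from the pykafka docs. It is written to run on both Python 2 and 3.

```python
try:
    import queue            # Python 3
except ImportError:
    import Queue as queue   # Python 2

def drain_reports(get_report):
    """Pull all pending (msg, exc) delivery reports until the queue is
    empty; return counts of successful and failed deliveries."""
    ok = failed = 0
    while True:
        try:
            msg, exc = get_report(block=False)
        except queue.Empty:
            break
        if exc is not None:
            failed += 1
        else:
            ok += 1
    return ok, failed

# With pykafka this would be called as
#     drain_reports(producer.get_delivery_report)
# after producing a batch. Stand-in demo with a plain queue so the
# helper can be exercised without a broker:
reports = queue.Queue()
for i in range(5):
    reports.put(('msg-%d' % i, None))                  # delivered
reports.put(('msg-5', RuntimeError('delivery failed')))  # one failure

ok, failed = drain_reports(lambda block: reports.get(block=block))
print('delivered=%d failed=%d' % (ok, failed))  # delivered=5 failed=1
```

Note that reports only become available once the broker has acked a batch, so draining after every single produce (as in the question's code) will usually hit Queue.Empty immediately; draining every N messages, as the GitHub example does, is the intended pattern.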