I am using Python to read data from Redis and then parse it and send it to Kafka. It works well in most situations.
But when I use Python to simulate pushing data into Redis, or when data is put into the queue very quickly, I don't receive all of it.
Here is the code for my Redis producer, which simulates publishing 20000 messages to Redis:
import redis

rc = redis.Redis(host='127.0.0.1', port=6379)
rc.ping()
ps = rc.pubsub()
ps.subscribe('bdwaf')
r_str = "--8198b507-A--\n[22/Jun/2017:14:13:19 +0800]ucTcxcMcicAcAcAcicAcAcAm 192.168.1.189 50054 127.0.0.1 80\n"
# publish 20000 messages to the 'bdwaf' channel as fast as possible
for i in range(0, 20000):
    rc.publish('bdwaf', r_str)
And the Redis consumer (which is also the Kafka producer) is:
import redis

rc = redis.Redis(host='localhost', port=6379)
rc.ping()
ps = rc.pubsub()
ps.subscribe('bdwaf')
num = 0
for item in ps.listen():
    if item['type'] == 'message':
        num += 1
        a.parser(item['data'])  # parse the message and forward it to Kafka
        print num
It only prints out about 4000 of the messages.
If I comment out the a.parser(item['data']) call, it prints the full count.
Or if I write sleep(0.001) in the Redis producer (as sketched below), it also prints the full count.
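For clarity, this is roughly what I mean by the slowed-down producer; it is the same producer as above with a short pause after each publish (0.001 is just the delay I tried):

import time
import redis

rc = redis.Redis(host='127.0.0.1', port=6379)
r_str = "--8198b507-A--\n[22/Jun/2017:14:13:19 +0800]ucTcxcMcicAcAcAcicAcAcAm 192.168.1.189 50054 127.0.0.1 80\n"
for i in range(0, 20000):
    rc.publish('bdwaf', r_str)
    time.sleep(0.001)  # pausing briefly after each publish lets the consumer see every message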
What is wrong with my code?