So I am using Celery with RabbitMQ. I have a RESTful API that registers a user. I am using a remote Celery worker to send the registration email asynchronously, so my API can return a fast response.
from .tasks import send_registration_email

def register_user(user_data):
    # save the user to the database; save_user stands in for the real persistence call
    user = save_user(user_data)
    # queue the email on the remote worker and return immediately
    send_registration_email.delay(user.id)
    return {'status': 'success'}
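For context, the task itself looks roughly like this (a minimal sketch, not my exact tasks.py; the retry policy and the email-sending details are assumptions):

# tasks.py -- hypothetical sketch; the real implementation is not shown here
from celery import shared_task

@shared_task(bind=True, max_retries=3, default_retry_delay=60)
def send_registration_email(self, user_id):
    try:
        # look up the user and send the registration email (details omitted)
        ...
    except Exception as exc:
        # re-queue the task so the email can be retried if sending fails
        raise self.retry(exc=exc)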
This works fine. The email is sent in a non-blocking, asynchronous way (and can be retried if it fails, which is nice). The problem is that when I look at the RabbitMQ management console, I can see that send_registration_email has created a random queue, something like:
I can see that the task has been executed successfully. So why does the random queue stay in RabbitMQ forever? This is the task payload:
{"status": "SUCCESS", "traceback": null, "result": true, "task_id": "aad10877-3508-4179-a5fb-99f1bd0b8b2f", "children": []}
This is normal behaviour if you have configured CELERY_RESULT_BACKEND in your settings: with the AMQP result backend, each task's result is published to its own auto-generated queue, which is what you are seeing in the management console. Please check here: Celery result backend description
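For reference, the behaviour comes from a configuration along these lines (a sketch only; the broker URL is an assumption, and the uppercase setting names follow the older Celery config style you are using):

# celeryconfig.py / settings.py -- assumed configuration, not taken from the question
BROKER_URL = 'amqp://guest:guest@localhost:5672//'
# With the AMQP result backend, every task gets its own result queue in RabbitMQ
CELERY_RESULT_BACKEND = 'amqp'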
You could disable the result backend (or mark the task with ignore_result=True), or decrease how long each result message lives before it expires.
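A minimal sketch of both options, using the older uppercase setting names to match CELERY_RESULT_BACKEND above (adjust the names for newer Celery versions):

# Option 1: don't store a result for this task at all
from celery import shared_task

@shared_task(ignore_result=True)
def send_registration_email(user_id):
    ...

# ...or ignore results globally in the Celery settings:
CELERY_IGNORE_RESULT = True

# Option 2: keep results, but let each result queue expire sooner
from datetime import timedelta
CELERY_TASK_RESULT_EXPIRES = timedelta(minutes=30)

With an expiry configured, the per-task result queues should be removed after that period instead of lingering in RabbitMQ.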