I'm banging my head against the wall with celeryd and RabbitMQ.
This example from the tutorial works just fine:
from celery import Celery

app = Celery('tasks', backend='amqp', broker='amqp://')

@app.task
def add(x, y):
    return x + y
I run:
celery -A tasks worker --loglevel=info
And I get the output:
[2014-11-18 19:47:58,874: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2014-11-18 19:47:58,881: INFO/MainProcess] mingle: searching for neighbors
[2014-11-18 19:47:59,889: INFO/MainProcess] mingle: all alone
[2014-11-18 19:47:59,896: WARNING/MainProcess] celery@vagrant-ubuntu-trusty-64 ready.
I can now run the task from the Python REPL and get a result.
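For example, something along these lines works (standard Celery API; the result comes back through the amqp result backend):

from tasks import add

# send the task to the worker
result = add.delay(4, 4)

# wait for the worker to publish the result to the amqp backend
print(result.get(timeout=10))  # 8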
But when I install celeryd, the process hangs at the mingle step:
[2014-11-18 20:18:33,893: INFO/MainProcess] mingle: searching for neighbors
No further output appears after this.
My /etc/default/celeryd looks like this:
ENABLED="true"
CELERYD_NODES="w1"
CELERYD_CHDIR="/home/myusername/src/celery-test"
CELERYD_OPTS="--time-limit=300 --concurrency=8"
CELERY_CONFIG_MODULE="celeryconfig"
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
I'm using these versions:
Ubuntu 14.04
celery 3.1.16 (Cipater)
celeryd 3.1.6-1ubuntu1
rabbitmq-server 3.2.4-1
Python 2.7.6
So without daemonizing, Celery can initialize itself, but the daemonized worker (celeryd) apparently hangs at the mingle step.
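A useful way to debug this kind of init-script problem (mentioned in the Celery daemonizing docs) is to run the script in the foreground with C_FAKEFORK set, so the worker does not detach and its output stays on the terminal. Assuming the stock /etc/init.d/celeryd script:

# run the init script verbosely and keep the worker in the foreground
sudo C_FAKEFORK=1 sh -x /etc/init.d/celeryd start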
Some forums suggest that this is a problem with RabbitMQ hitting its disk space limit. I have plenty of free disk space, and RabbitMQ's own logs do not indicate any problem.
I got a hunch from this message:
So for some reason I had the librabbitmq1 apt package installed, and it was too old.
And it turns out that the problem was the librabbitmq1 package. I removed it with:
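(Something along these lines; the standard apt removal.)

# remove the too-old C librabbitmq bindings package
sudo apt-get remove librabbitmq1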
The Python Celery/RabbitMQ libs then fell back to some other (maybe pure-Python?) implementation, which works!
Check your free disk space. RabbitMQ requires 1 GB of free disk space by default.
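For example, you can compare the actual free space with RabbitMQ's configured limit; rabbitmqctl status prints both disk_free and disk_free_limit:

# free space on the partition holding RabbitMQ's data directory
df -h /var/lib/rabbitmq

# look for disk_free and disk_free_limit in the output
sudo rabbitmqctl status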
If you use the database backend, adding the following options to your Celery configuration should solve the problem:
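(Illustrative only; the exact options may differ. One Celery 3.x setting in this area is CELERYD_FORCE_EXECV, which makes prefork child processes start fresh via exec instead of inheriting the parent's state, to avoid deadlocks caused by connections shared across forks.)

# celeryconfig.py -- illustrative sketch, not necessarily the exact options meant above
CELERYD_FORCE_EXECV = True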