Django Celery tutorial not returning results

Posted 2019-02-08 20:20

Question:

UPDATE 3: Found the issue. See the answer below.

UPDATE 2: It seems I may have been running into an automatic naming and relative imports problem by working through the djcelery tutorial in the manage.py shell; see below. It is still not working for me, but I now get new error messages in the log. See below.

UPDATE: I added the log at the bottom of the post. It seems the example task is not registered?

Original Post:

I am trying to get django-celery up and running. I was not able to get through the example.

I installed rabbitmq successfully and went through the tutorials without trouble: http://www.rabbitmq.com/getstarted.html

I then tried to go through the djcelery tutorial.

When I run python manage.py celeryd -l info I get the message:

[Tasks]
- app.module.add
[2011-07-27 21:17:19,990: WARNING/MainProcess] celery@sequoia has started.

So that looks good. I put this at the top of my settings file:

import djcelery
djcelery.setup_loader()

BROKER_HOST = "localhost"
BROKER_PORT = 5672
BROKER_USER = "guest"
BROKER_PASSWORD = "guest"
BROKER_VHOST = "/"

I added this to my installed apps:

'djcelery',

Here is my tasks.py file in the tasks folder of my app:

from celery.task import task

@task()
def add(x, y):
    return x + y

I added this to my django.wsgi file:

os.environ["CELERY_LOADER"] = "django"
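
For context, here is roughly where that line sits in a Django 1.x style django.wsgi file. This is only a sketch; the settings module name is a placeholder for whatever your project uses:

import os

# Tell Celery to use the Django loader before anything imports celery.
os.environ["CELERY_LOADER"] = "django"
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "settings")

import django.core.handlers.wsgi
application = django.core.handlers.wsgi.WSGIHandler()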

Then I entered this at the command line:

>>> from app.module.tasks import add
>>> result = add.delay(4,4)
>>> result
(AsyncResult: 7auathu945gry48- a bunch of stuff)
>>> result.ready()
False

So it looks like it worked, but here is the problem:

>>> result.result
>>>               (nothing is returned)
>>> result.get()

When I put in result.get() it just hangs. What am I doing wrong?
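
As a diagnostic sketch (assuming the same app.module.tasks.add task as above), get() accepts a timeout, so the shell raises an error instead of hanging forever:

from app.module.tasks import add

result = add.delay(4, 4)
try:
    # Fail after 10 seconds instead of blocking forever when no result arrives.
    print(result.get(timeout=10))
except Exception as exc:
    print("no result within 10 seconds: %r" % exc)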

UPDATE: This is what running the logger in the foreground says when I start up the worker server:

No handlers could be found for logger "multiprocessing"

[Configuration]
- broker:      amqplib://guest@localhost:5672/
- loader:      djcelery.loaders.DjangoLoader
- logfile:     [stderr]@INFO
- concurrency: 4
- events:      OFF
- beat:        OFF

[Queues]
- celery:      exchange: celery (direct)  binding: celery

[Tasks]
 - app.module.add
[2011-07-27 21:17:19,990: WARNING/MainProcess] celery@sequoia has started.

C:\Python27\lib\site-packages\django-celery-2.2.4-py2.7.egg\djcelery\loaders.py:80: UserWarning: Using settings.DEBUG leads to a memory leak, never use this setting in production environments!
    warnings.warn("Using settings.DEBUG leads to a memory leak, never"

Then when I put in the command:

>>> result = add(4,4)

This appears in the error log:

[2011-07-28 11:00:39,352: ERROR/MainProcess] Unknown task ignored: Task of kind 'tasks.add' is not registered, please make sure it's imported. Body->"{'retries': 0, 'task': 'tasks.add', 'args': (4,4), 'expires': None, 'eta': None,
    'kwargs': {}, 'id': '225ec0ad-195e-438b-8905-ce28e7b6ad9'}"
Traceback (most recent call last):
  File "C:\Python27\..\celery\worker\consumer.py", line 368, in receive_message
    eventer=self.event_dispatcher)
  File "C:\Python27\..\celery\worker\job.py", line 306, in from_message
    **kw)
  File "C:\Python27\..\celery\worker\job.py", line 275, in __init__
    self.task = tasks[self.task_name]
  File "C:\Python27\...\celery\registry.py", line 59, in __getitem__
    raise self.NotRegistered(key)
NotRegistered: 'tasks.add'

How do I get this task registered and handled properly? Thanks.

UPDATE 2:

This link suggested that the "not registered" error can be due to task name mismatches between the client and the worker: http://celeryproject.org/docs/userguide/tasks.html#automatic-naming-and-relative-imports
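
One way to check for such a mismatch (a sketch, assuming the Celery 2.x registry module that appears in the traceback above) is to print the registered task names next to the name the client will send:

from celery.registry import tasks   # Celery 2.x task registry
from app.module.tasks import add    # importing the module registers the task

print(sorted(tasks.keys()))  # names the worker side would recognize
print(add.name)              # name the client puts on the message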

I exited the manage.py shell, started a plain Python shell, and entered the following:

>>> from app.module.tasks import add
>>> result = add.delay(4,4)
>>> result.ready()
False
>>> result.result
>>>                 (nothing returned)
>>> result.get()
                    (it just hangs there)

So I am getting the same behavior, but a new log message. From the log, it appears the worker is processing the task but won't feed the result back out:

[2011-07-28 11:39:21,706: INFO/MainProcess] Got task from broker: app.module.tasks.add[7e794740-63c4-42fb-acd5-b9c6fcd545c3]
[2011-07-28 11:39:21,706: INFO/MainProcess] Task app.module.tasks.add[7e794740-63c4-42fb-acd5-b9c6fcd545c3] succeeded in 0.04600000038147s: 8

So the worker got the task and computed the correct answer, but it won't send it back. Why not?

Answer 1:

I found the solution to my problem in another Stack Overflow post: Why does Celery work in Python shell, but not in my Django views? (import problem)

I had to add these lines to my settings file:

CELERY_RESULT_BACKEND = "amqp"
CELERY_IMPORTS = ("app.module.tasks", )

Then in the tasks.py file I named the task explicitly:

@task(name="module.tasks.add")

Both the worker and the client had to be informed of the task name. The celery and django-celery tutorials omit these lines.
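
Putting it together, the relevant pieces ended up looking roughly like this (a sketch; app and module are the placeholder names used throughout this post):

# settings.py
import djcelery
djcelery.setup_loader()

BROKER_HOST = "localhost"
BROKER_PORT = 5672
BROKER_USER = "guest"
BROKER_PASSWORD = "guest"
BROKER_VHOST = "/"

CELERY_RESULT_BACKEND = "amqp"            # lets the worker send results back
CELERY_IMPORTS = ("app.module.tasks", )   # makes the worker register the task

# app/module/tasks.py
from celery.task import task

@task(name="module.tasks.add")            # explicit name, same on client and worker
def add(x, y):
    return x + y

With the result backend configured and both sides agreeing on the task name, result.get() returns 8 instead of hanging.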



Answer 2:

If you run celery in debug mode it is easier to understand the problem:

python manage.py celeryd

What do the celery logs say — is celery receiving the task? If not, there is probably a problem with the broker (wrong queue?).

Give us more detail so we can help you.
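
As a side note, with the old django-celery management command the debug run mentioned above would look something like this (assuming the setup from the question):

python manage.py celeryd --loglevel=DEBUG

At that log level the worker prints every message it receives, which makes it easier to see whether the task arrives and under what name.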