I am using Django==2.0.5 and celery==4.0.2.

My proj/proj/celery.py looks like:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj', include=[])
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
# app.autodiscover_tasks()
@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
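For context on the namespace='CELERY' line above: only settings carrying that prefix are read from Django's settings module. A settings.py fragment might look like this (these particular values are assumptions for illustration, not taken from my project):

```python
# settings.py (fragment) -- only keys with the CELERY_ prefix are read,
# because of namespace='CELERY' in celery.py
CELERY_BROKER_URL = 'redis://localhost:6379/0'  # placeholder broker URL
CELERY_TASK_SERIALIZER = 'json'
```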
I was expecting that none of the tasks decorated with shared_task in the apps' tasks.py modules would be discovered, but to my surprise most of the tasks show up under [tasks] when I run the worker with celery worker -A proj -l INFO.
My directory structure is somewhat like:
app
│ ├── __init__.py
│ ├── admin.py
│ ├── apps.py
│ ├── constants.py
│ ├── scripts
│ │ ├── __init__.py
│ ├── factories.py
│ ├── migrations
│ │ ├── 0001_initial.py
│ │ ├── __init__.py
│ ├── models.py
│ ├── tasks.py
│ └── tests
│ ├── __init__.py
CELERY_IMPORTS is not set in settings.py. I have even tried CELERY_IMPORTS = () and CELERY_IMPORTS = ['path/to/one/of/the/modules'], and even then all the tasks get discovered.
Any suggestion is welcome.
You should try @app.task without the (bind=True) flag. Also, Celery can detect all the tasks inside a module if they are imported in the module's __init__.py. An import in the Django settings.py will likewise cause everything in that module to be discovered. Make sure you haven't done something similar.
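The import side effect described above can be sketched in plain Python: decorating a function registers it immediately, so merely importing the module populates the registry, no discovery step needed. (fake_shared_task and REGISTRY here are illustrative stand-ins for Celery's machinery, not Celery's actual API.)

```python
# Stand-in for Celery's task registry: decorating == registering,
# so any import of this module registers tasks as a side effect.
REGISTRY = {}

def fake_shared_task(fn):
    # Record the function under "<module>.<name>", like Celery's task names.
    REGISTRY['{0}.{1}'.format(fn.__module__, fn.__name__)] = fn
    return fn

@fake_shared_task
def send_email():
    return 'sent'

# The task is registered even though nothing ever "discovered" it:
print(sorted(REGISTRY))
```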
Can you also share the project folder structure?