celery: how to implement a single queue with multiple workers executing in parallel

Posted 2019-07-23 16:11

I am currently running celery 4.0.2 with a single worker like this:

celery.py:

from celery import Celery

app = Celery('project',
             broker='amqp://jimmy:jimmy123@localhost/jimmy_vhost',
             backend='rpc://',
             include=['project.tasks'])

if __name__ == '__main__':
    app.start()

tasks.py:

from .celery import app
from celery.schedules import schedule
from time import sleep, strftime

# Intervals are plain seconds: 1800 s = 30 min, 900 s = 15 min.
app.conf.beat_schedule = {
    'planner_1': {
        'task': 'project.tasks.call_orders',
        'schedule': 1800,
    },
    'planner_2': {
        'task': 'project.tasks.call_inventory',
        'schedule': 900,
    },
}

I used the following command to start the worker with an embedded beat scheduler:

 celery -A project worker -l info --concurrency=3 --beat -E

Right now there is only a single queue, with a single worker running.

My question is: how can I run Celery with multiple workers and a single queue, so that tasks are executed in parallel using multiprocessing, without any task being executed twice?
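For example, would running beat as its own process and starting several workers on the same default queue be the right approach? A rough sketch of what I have in mind (the node names worker1/worker2 are just placeholders):

    # run exactly one beat process, so scheduled tasks are not enqueued twice
    celery -A project beat -l info

    # start multiple workers that all consume from the same default queue,
    # each with a unique node name
    celery -A project worker -l info -n worker1@%h
    celery -A project worker -l info -n worker2@%h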

I looked up on the internet how to run Celery with multiprocessing. According to this article:

celery worker -l info -P processes -c 16 will result in a single message consumer delegating work to 16 OS-level pool processes. Each OS-level process can be assigned to different CPU in a multicore environment, and as such it will process tasks in parallel, but it will not consume messages in parallel.

Can using the -P processes argument solve my problem? Also, what is meant by "it will process tasks in parallel, but it will not consume messages in parallel"?
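If I understand the article correctly, the following (my own guess; I believe processes is an alias for celery's default prefork pool) would give one message consumer feeding a pool of 16 OS-level processes:

    # a single master process consumes messages from the queue one at a
    # time and hands each task to one of 16 prefork child processes
    celery -A project worker -l info -P prefork -c 16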
