How to configure and run a Celery worker on a remote server

Posted 2019-01-22 20:40

Question:

I am working with Celery and a RabbitMQ server. I created a Django project on one server (where the message queue and database live) and it is working fine. I have also created multiple workers:

from kombu import Exchange, Queue
CELERY_CONCURRENCY = 8

CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack', 'yaml']

CELERY_RESULT_BACKEND = 'amqp'
CELERYD_HIJACK_ROOT_LOGGER = True
CELERY_HIJACK_ROOT_LOGGER = True
BROKER_URL = 'amqp://guest:guest@localhost:5672//'

CELERY_QUEUES = (
  Queue('default', Exchange('default'), routing_key='default'),
  Queue('q1', Exchange('A'), routing_key='routingKey1'),
  Queue('q2', Exchange('B'), routing_key='routingKey2'),
)
CELERY_ROUTES = {
 'my_taskA': {'queue': 'q1', 'routing_key': 'routingKey1'},
 'my_taskB': {'queue': 'q2', 'routing_key': 'routingKey2'},
}


AMQP_SERVER = "127.0.0.1"
AMQP_PORT = 5672
AMQP_USER = "guest"
AMQP_PASSWORD = "guest"
AMQP_VHOST = "/"


CELERY_INCLUDE = ('functions',)
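The `CELERY_ROUTES` mapping above keys on the task *name* and tells the producer which queue and routing key to publish to. A plain-Python sketch of that lookup (illustration only, not Celery internals):

```python
# Mirror of the CELERY_ROUTES setting from the config above.
CELERY_ROUTES = {
    "my_taskA": {"queue": "q1", "routing_key": "routingKey1"},
    "my_taskB": {"queue": "q2", "routing_key": "routingKey2"},
}

def route_for(task_name, default_queue="default"):
    """Return the queue a task name would be routed to,
    falling back to the default queue for unrouted tasks."""
    return CELERY_ROUTES.get(task_name, {}).get("queue", default_queue)

print(route_for("my_taskA"))   # q1
print(route_for("unknown"))    # default
```

This is why the worker only needs to know which queues to consume (`celery worker -Q q1,q2`); the producer decides the destination by task name.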

But I want to run workers from another server, so I need some information on how to run a worker on another system. A few sites I referred to say that the Django project also needs to run on the remote system. Is that necessary?

Answer 1:

Here is the gist of the idea:

On Machine A:

  1. Install Celery & RabbitMQ.
  2. Configure rabbitmq so that Machine B can connect to it.
  3. Create my_tasks.py with some tasks and put some tasks in queue.

On Machine B:

  1. Install Celery.
  2. Copy my_tasks.py file from machine A to this machine.
  3. Run a worker to consume the tasks.

I had the same requirement and experimented with Celery; it is a lot easier than it looks. I wrote a detailed blog post on it a few days back. Check out how to send tasks to remote machine?



Answer 2:

You can make use of app.send_task() with something like the following in your Django project:

from celery import Celery
import my_client_config_module

app = Celery()
app.config_from_object(my_client_config_module)

# The first argument is the task's dotted name as registered on the
# remote worker; no task code needs to exist on the sending side.
app.send_task('dotted.path.to.function.on.remote.server.relative.to.worker',
              args=(1, 2))
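The `my_client_config_module` imported above is not shown in the answer; a hypothetical sketch of what it might contain (all names and the broker host are assumptions, using Celery's lowercase setting names):

```python
# my_client_config_module.py -- hypothetical client-side settings for
# the sending application; only the broker location is strictly needed.
broker_url = "amqp://guest:guest@192.0.2.10:5672//"  # remote RabbitMQ (placeholder host)
result_backend = "rpc://"       # where task results are delivered back
task_serializer = "json"        # serialize outgoing task messages as JSON
accept_content = ["json"]       # refuse anything but JSON on this side
```

With this, `send_task` publishes the message to the remote broker without ever importing the task's implementation.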


Answer 3:

First, think about how Celery really works.

The Celery producer adds a task to the queue with its name and other headers that identify which code should run.

Celery does not put the complete executable function on the message queue.

Now look at the worker (consumer) side.

Celery reads the task details from the MQ and tries to run the task. To run it, the module/files/environment/codebase implementing the task must be available on the worker.

Now let's come to your question.

You want to set up a worker on a separate machine. So, logically, to execute the function a task points to, the worker needs the complete code environment for those tasks, and it must be able to connect to the MQ where the tasks live (otherwise, how would it fetch them?).
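The point about only the task's name traveling over the broker can be illustrated with plain JSON (a rough sketch, not Celery's exact wire format; the task id is a made-up placeholder):

```python
import json

# What effectively travels over the broker: the task *name* and its
# arguments -- never the function's code.
message = {
    "task": "my_taskA",            # dotted name the worker resolves locally
    "args": [2, 3],
    "kwargs": {},
    "id": "hypothetical-task-id",  # placeholder, not a real task id
}
payload = json.dumps(message)

# The worker deserializes the payload and looks the name up in its own
# task registry -- so the module defining my_taskA must exist there.
received = json.loads(payload)
print(received["task"])  # my_taskA
```

This is exactly why copying the tasks module to the worker machine (or deploying the same codebase there) is unavoidable.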



Answer 4:

Basically I will second ChillarAnand's answer. I would like to comment on his answer, but I can't because I don't have 50 reputation.

So, the answer to your question:

First, read "how to send tasks to remote machine?", as ChillarAnand mentioned.

It is a really good article, with one small flaw: remote.py does not have @app.task on def add(), which caused a problem and confused me as a newbie to Celery.

As for the "[Errno 113] No route to host" part:

I guess you have a firewall running on your RabbitMQ server; you might want to check. Most of the time it is iptables, but it could be something else. Switch it off, or change the rules, then give it another try.
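Before touching firewall rules, a quick reachability probe from the worker machine can confirm whether the broker port is the problem at all. A small sketch (the host is a placeholder; 5672 is RabbitMQ's default AMQP port):

```python
import socket

def broker_reachable(host, port=5672, timeout=3.0):
    """Return True if a plain TCP connection to host:port succeeds.
    A False here (refused or timed out) points at a firewall,
    a wrong address, or RabbitMQ not listening on that interface."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. broker_reachable("192.0.2.10") from Machine B before debugging
# Celery itself.
```

If this returns False while RabbitMQ is running, the problem is network-level (iptables, security groups, bind address), not Celery configuration.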