Can a celery worker/server accept tasks from a non-celery producer?

Posted 2019-02-14 21:10

Question:

I want to use a comet server written with Java NIO to send out live updates. When it receives information, I want it to scan the data and dispatch tasks to worker threads via RabbitMQ. Ideally I would like a Celery server to sit on the other end of RabbitMQ, managing a pool of worker threads that will handle these tasks.

However, from my understanding, Celery sits on both ends of RabbitMQ: it takes over the roles of both producer and consumer by being embedded in both the producer's and the consumer's code. Is there a way to set up Celery as I described above? Thanks.

Answer 1:

It is not necessary to use Celery to publish messages. You can publish messages to RabbitMQ, or to another broker, from your own app and use Celery only to consume the tasks.

Celery uses a simple, documented message protocol. You can implement the client side of it in your application.
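
Here is a minimal sketch of what that can look like with kombu, hand-building a message in Celery's task protocol (version 2). It assumes a Celery worker is consuming the default 'celery' queue and has a task registered under the name 'tasks.add'; both names are illustrative, not part of the original answer:

import uuid

from kombu import Connection, Exchange, Queue

task_id = str(uuid.uuid4())
# Celery's default queue: named 'celery', bound to the 'celery'
# direct exchange with routing key 'celery'.
task_queue = Queue('celery', Exchange('celery'), routing_key='celery')

with Connection('amqp://guest:guest@localhost//') as conn:
    producer = conn.Producer(serializer='json')
    producer.publish(
        # Protocol v2 body: (args, kwargs, embed)
        [[2, 3], {}, {'callbacks': None, 'errbacks': None,
                      'chain': None, 'chord': None}],
        exchange=task_queue.exchange,
        routing_key=task_queue.routing_key,
        declare=[task_queue],
        headers={
            'lang': 'py',
            'task': 'tasks.add',  # must match the worker's registered task name
            'id': task_id,
            'root_id': task_id,
            'parent_id': None,
            'group': None,
        },
        correlation_id=task_id,
    )

The worker recognizes the message as a task because of the 'task' header, so the producer needs no Celery code at all, only a broker client.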

If you don't want to implement the client side of the protocol, you can run a simple HTTP server (with Celery installed) that accepts requests and makes the appropriate task calls.
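
As a sketch of that bridge idea, here is one possible version using Flask (the framework choice and the 'tasks.add' task name are assumptions for illustration). Because the bridge imports Celery, it can use send_task() to publish a task by name without having the task's code available:

from celery import Celery
from flask import Flask, jsonify, request

celery_app = Celery(broker='amqp://')
flask_app = Flask(__name__)


@flask_app.route('/enqueue/<task_name>', methods=['POST'])
def enqueue(task_name):
    # Forward the JSON payload as the positional arguments of the named task.
    args = request.get_json(silent=True) or []
    result = celery_app.send_task(task_name, args=args)
    return jsonify({'task_id': result.id})


if __name__ == '__main__':
    flask_app.run(port=5000)

Your Java NIO comet server could then simply POST e.g. [2, 3] to /enqueue/tasks.add without knowing anything about Celery.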



Answer 2:

Yes, of course!

You can add Custom Message Consumers to a Celery app.

Please refer to Extensions and Bootsteps in the Celery documentation.

Here is part of the example code from the link above:

from celery import Celery
from celery import bootsteps
from kombu import Consumer, Exchange, Queue

# The queue the external (non-Celery) producer publishes to.
my_queue = Queue('custom', Exchange('custom'), 'routing_key')

app = Celery(broker='amqp://')


class MyConsumerStep(bootsteps.ConsumerStep):

    def get_consumers(self, channel):
        # Attach a plain kombu Consumer for the custom queue; messages here
        # don't need to follow Celery's task protocol.
        return [Consumer(channel,
                         queues=[my_queue],
                         callbacks=[self.handle_message],
                         accept=['json'])]

    def handle_message(self, body, message):
        print('Received message: {0!r}'.format(body))
        message.ack()  # acknowledge so the broker removes the message


app.steps['consumer'].add(MyConsumerStep)

Test it (assuming the code above is saved as main.py):

python -m celery -A main worker
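
To exercise the consumer from a non-Celery producer, publish any JSON payload to the custom queue. A sketch using kombu, assuming RabbitMQ at the default amqp:// URL:

from kombu import Connection, Exchange, Queue

# Mirror the queue declaration from the consumer step above.
my_queue = Queue('custom', Exchange('custom'), 'routing_key')

with Connection('amqp://') as conn:
    producer = conn.Producer(serializer='json')
    producer.publish({'hello': 'world'},
                     exchange=my_queue.exchange,
                     routing_key='routing_key',
                     declare=[my_queue])

The running worker should then print the received body via handle_message.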

See also: Using Celery with existing RabbitMQ messages