I'm writing an application which needs to run a series of tasks in parallel and then run a single task with the results of all of those tasks:
    @celery.task
    def power(value, expo):
        return value ** expo

    @celery.task
    def amass(values):
        print str(values)
It's a very contrived and oversimplified example, but hopefully the point comes across well. Basically, I have many items which need to run through power, but I only want to run amass on the results from all of those tasks. All of this should happen asynchronously, and I don't need anything back from the amass method.
Does anyone know how to set this up in celery so that everything is executed asynchronously and a single callback with a list of the results is called after all is said and done?
I've set up this example to run with a chord, as Alexander Afanasiev recommended:
    from time import sleep
    import random

    from celery import chord

    # power and amass are the tasks defined above
    tasks = []
    for i in xrange(10):
        tasks.append(power.s(i, 2))
        sleep(random.randint(10, 1000) / 1000.0)  # sleep for 10-1000ms between subtasks

    callback = amass.s()
    r = chord(tasks)(callback)
Unfortunately, in the above example, all tasks in tasks are started only when the chord method is called. Is there a way that each task can start separately, and then I could add a callback to the group to run when everything has finished?
Taking a look at this snippet from your question, it looks like you are passing a list as the chord header, rather than a group:

    r = chord(tasks)(callback)

Converting the list to a group should result in the behaviour you're expecting.
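A minimal sketch of that change (assuming group is imported from celery alongside chord):

    from celery import chord, group

    r = chord(group(tasks))(callback)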
The answer that @alexander-afanasiev gave you is essentially right: use a chord.

Your code is OK, but tasks.append(power.s(i, 2)) does not actually execute the subtask; it just adds subtasks to a list. It is the chord(...)(...) call that sends as many messages to the broker as there are subtasks defined in the tasks list, plus one more message for the callback subtask. When you call chord, it returns as soon as it can. If you want to know when the chord has finished, you can poll for completion just as with a single task, using r.ready() in your sample (see the sketch below). Celery has plenty of tools for most of the workflows you can imagine.
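For instance, a minimal polling loop over the chord result from your sample might look like this (the half-second interval is an arbitrary choice):

    from time import sleep

    r = chord(tasks)(callback)  # r is the AsyncResult of the amass callback
    while not r.ready():        # True once every subtask and the callback have finished
        sleep(0.5)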
It seems you need to make use of chord. Here's the gist from the docs: a chord is a task that only executes after all of the tasks in a group have finished executing.
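For example, a minimal sketch with the power and amass tasks from the question (the tasks module name is an assumption) could look like this:

    from celery import chord

    from tasks import power, amass  # hypothetical module holding the tasks

    # header: ten power subtasks run in parallel;
    # body: amass is called once with the list of their results
    result = chord(power.s(i, 2) for i in xrange(10))(amass.s())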
Here's a solution which worked for my purposes:
tasks.py:
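A minimal sketch of the idea (broker and backend URLs are placeholders, and the self-reposting amass is an assumption based on the description at the end of this answer):

    from celery import Celery

    # placeholder broker/backend configuration
    celery = Celery('tasks', broker='amqp://', backend='amqp://')

    @celery.task
    def power(value, expo):
        return value ** expo

    @celery.task
    def amass(task_ids):
        results = [celery.AsyncResult(task_id) for task_id in task_ids]
        if not all(result.ready() for result in results):
            # not everything has finished yet: re-post amass and check again shortly
            amass.apply_async(args=[task_ids], countdown=1)
            return
        print([result.get() for result in results])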
Use Case:
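A corresponding usage sketch, posting every power task straight away and then posting amass with the collected task ids:

    from tasks import power, amass  # hypothetical module name, as above

    # post all of the power tasks immediately
    task_ids = [power.delay(i, 2).id for i in xrange(10)]

    # post the callback; it keeps re-posting itself until every power task has finished
    amass.delay(task_ids)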
What this should do is start all of the tasks as soon as possible, asynchronously. Once they've all been posted to the queue, the amass task will also be posted to the queue. The amass task will keep reposting itself until all of the other tasks have been completed.