My model post-processing uses the post_save signal:
from django.db.models.signals import post_save
from django.dispatch import receiver

from models import MyModel
from pipeline import this_takes_forever


@receiver(post_save, sender=MyModel)
def my_callback(sender, **kwargs):
    this_takes_forever(sender)
The this_takes_forever routine does IO, so I want to defer it to avoid blocking the request too much.
I thought this was a great use case for the new asyncio module. But I have a hard time getting my mind around the whole process.
I think I should be able to adapt the signal receiver like this:
import asyncio

@receiver(post_save, sender=MyModel)
def my_callback(sender, **kwargs):
    loop = asyncio.get_event_loop()
    loop.run_until_complete(this_takes_forever(sender))
    loop.close()
Provided this_takes_forever is also adapted to be a coroutine:

from asyncio import coroutine

@coroutine
def this_takes_forever(instance):
    # do something with instance
    return instance
This sounds too magical to work. And in fact it halts with an AssertionError:
AssertionError at /new/
There is no current event loop in thread 'Thread-1'.
I don't see where I should start the loop in this context. Has anyone tried something like this?
You get no benefit in your case: loop.run_until_complete(this_takes_forever(sender)) is equal to a plain this_takes_forever(sender) call in terms of execution time.
loop.run_until_complete waits for the this_takes_forever(sender) coroutine call to finish, so you get a synchronous call in the second case just as in the first one.

About the AssertionError: you start the Django app in multithreaded mode, but asyncio makes a default event loop for the main thread only -- you have to register a new loop for every user-created thread where you need to call asyncio code.

But, to say it again, asyncio cannot solve your particular problem; it is just incompatible with Django.
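For illustration only, here is a minimal sketch of what registering a per-thread loop inside the receiver would look like; it makes the AssertionError go away, but the request still blocks until the coroutine finishes, exactly like a direct call:

import asyncio

@receiver(post_save, sender=MyModel)
def my_callback(sender, **kwargs):
    # Worker threads have no default event loop, so create and register one.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        # Still blocks the request until the coroutine completes.
        loop.run_until_complete(this_takes_forever(sender))
    finally:
        loop.close()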
The standard way for Django is to defer long-running code into a Celery task (see http://www.celeryproject.org/).
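A minimal sketch of that approach, assuming Celery is already configured for the project (the task name post_process and the module layout are made up for this example):

# tasks.py
from celery import shared_task

from models import MyModel
from pipeline import this_takes_forever

@shared_task
def post_process(instance_pk):
    # Runs in a Celery worker process, outside the request/response cycle.
    instance = MyModel.objects.get(pk=instance_pk)
    this_takes_forever(instance)

The receiver then just enqueues the task and returns immediately:

# signals.py
from django.db.models.signals import post_save
from django.dispatch import receiver

from models import MyModel
from tasks import post_process

@receiver(post_save, sender=MyModel)
def my_callback(sender, instance, **kwargs):
    # Pass the primary key, not the instance, so the task payload stays serializable.
    post_process.delay(instance.pk)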