PEP 492 adds a new __await__ magic method. An object that implements this method becomes a future-like object and can be awaited with await. That much is clear:
import asyncio

class Waiting:
    def __await__(self):
        yield from asyncio.sleep(2)
        print('ok')

async def main():
    await Waiting()

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
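As an aside, this example is version-sensitive: on recent Python versions asyncio.sleep is a native coroutine, so the plain yield from asyncio.sleep(2) above raises TypeError. A sketch that works on current Python, delegating to sleep's __await__() iterator and using asyncio.run instead of the manual loop (sleep shortened, and the print replaced by a return value so the result is visible):

```python
import asyncio

class Waiting:
    def __await__(self):
        # asyncio.sleep is a native coroutine on recent Python, so we
        # delegate to the iterator its __await__() returns
        yield from asyncio.sleep(0.01).__await__()
        return 'ok'

async def main():
    return await Waiting()

result = asyncio.run(main())
print(result)  # ok
```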
Ok, but what if I want to call some async def function instead of asyncio.sleep? I can't use await, because __await__ is not an async function, and I can't use yield from, because native coroutines require an await expression:
async def new_sleep():
    await asyncio.sleep(2)

class Waiting:
    def __await__(self):
        yield from new_sleep()  # this is a TypeError
        await new_sleep()       # this is a SyntaxError
        print('ok')
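The TypeError can be reproduced without an event loop at all: the check fires as soon as the plain generator is advanced, before the coroutine body ever runs. A minimal sketch (sleep_like and bad_await are made-up names):

```python
async def sleep_like():
    return None  # the body never runs; the type check fires first

def bad_await():
    # a plain generator cannot delegate to a native coroutine object
    yield from sleep_like()

gen = bad_await()
try:
    next(gen)
    message = 'no error'
except TypeError as exc:
    message = str(exc)
print(message)
```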
How can I solve it?
Use a direct __await__() call:
async def new_sleep():
    await asyncio.sleep(2)

class Waiting:
    def __await__(self):
        return new_sleep().__await__()
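A quick end-to-end check of this solution (sleep shortened; the 42 return value is added to show that the awaited result propagates; asyncio.run assumed, i.e. Python 3.7+):

```python
import asyncio

async def new_sleep():
    await asyncio.sleep(0.01)
    return 42

class Waiting:
    def __await__(self):
        # hand back the coroutine's own iterator
        return new_sleep().__await__()

async def main():
    # the return value of new_sleep() propagates through unchanged
    return await Waiting()

value = asyncio.run(main())
print(value)  # 42
```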
The solution was recommended by Yury Selivanov (the author of PEP 492) for the aioodbc library.

Short version: await foo can be replaced by yield from foo.__await__().
Combining all the ideas from the other answers: in the simplest case, just delegating to another awaitable works:
def __await__(self):
    return new_sleep().__await__()
This works because the __await__ method returns an iterator (see PEP 492), so returning another __await__'s iterator is fine.
This means, of course, that we can't change the suspension behavior of the original awaitable at all. The more general approach is to mirror the await keyword and use yield from - this lets us combine multiple awaitables' iterators into one:
def __await__(self):
    # theoretically possible, but not useful for my example:
    # yield from something_else_first().__await__()
    yield from new_sleep().__await__()
Here's the catch: this is not doing exactly the same thing as the first variant! yield from is an expression, so to do exactly the same as before, we need to also return that value:
def __await__(self):
    return (yield from new_sleep().__await__())
This directly mirrors how we would write proper delegation using the await syntax:

return await new_sleep()
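To see that the return really forwards the awaited value, a small runnable sketch (the fetch coroutine and the Fetcher/LossyFetcher names are invented for the demo; sleep shortened):

```python
import asyncio

async def fetch():
    # stand-in for any coroutine that produces a value
    await asyncio.sleep(0.01)
    return {'status': 200}

class Fetcher:
    def __await__(self):
        # "return (yield from ...)" forwards the awaited value
        return (yield from fetch().__await__())

class LossyFetcher:
    def __await__(self):
        # same delegation, but the expression's value is discarded,
        # so awaiting this produces None
        yield from fetch().__await__()

async def main():
    return await Fetcher(), await LossyFetcher()

got, lost = asyncio.run(main())
print(got, lost)  # {'status': 200} None
```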
extra bit - what's the difference between these two?

def __await__(self):
    do_something_synchronously()
    return new_sleep().__await__()

def __await__(self):
    do_something_synchronously()
    return (yield from new_sleep().__await__())
The first variant is a plain function: when you call it, do_something_synchronously is executed and an iterator is returned. The second is a generator function; calling it doesn't execute any of our code at all! Only when the returned iterator is iterated for the first time will do_something_synchronously be executed. This makes a difference in the following, slightly contrived situation:
def foo():
    tmp = Waiting().__await__()
    do_something()
    yield from tmp
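To make the difference concrete, here is a sketch that can be driven by hand (the Eager/Lazy names and the log list are made up for the demo; asyncio.sleep(0) is used because it can be iterated without a running event loop):

```python
import asyncio

log = []

async def new_sleep():
    # sleep(0) just yields control once and needs no running loop,
    # so the iterators below can be driven by hand with next()
    await asyncio.sleep(0)

class Eager:
    def __await__(self):
        log.append('eager')  # runs as soon as __await__() is called
        return new_sleep().__await__()

class Lazy:
    def __await__(self):
        log.append('lazy')   # runs only when iteration starts
        return (yield from new_sleep().__await__())

eager_it = Eager().__await__()  # side effect fires right here
lazy_it = Lazy().__await__()    # nothing has run yet
print(log)                      # ['eager']

next(lazy_it)                   # first step triggers the lazy side effect
print(log)                      # ['eager', 'lazy']

eager_it.close()                # tidy up the partially driven iterators
lazy_it.close()
```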
I didn't understand why I can't yield from a native coroutine inside __await__, but it looks like it's possible to yield from a generator-based coroutine inside __await__, and to yield from a native coroutine inside that generator-based coroutine. It works:
async def new_sleep():
    await asyncio.sleep(2)

class Waiting:
    def __await__(self):
        # note: asyncio.coroutine is deprecated since Python 3.8
        # and was removed in 3.11
        @asyncio.coroutine
        def wrapper(coro):
            return (yield from coro)
        return (yield from wrapper(new_sleep()))
To await inside an __await__ function, use the following code:
async def new_sleep():
    await asyncio.sleep(1)

class Waiting:
    def __await__(self):
        yield from new_sleep().__await__()
        print('first sleep')
        yield from new_sleep().__await__()
        print('second sleep')
        return 'done'
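Run through the event loop, this prints both messages and hands back the return value of __await__ (sketch with the sleep shortened and asyncio.run assumed, i.e. Python 3.7+):

```python
import asyncio

async def new_sleep():
    await asyncio.sleep(0.01)

class Waiting:
    def __await__(self):
        yield from new_sleep().__await__()
        print('first sleep')
        yield from new_sleep().__await__()
        print('second sleep')
        return 'done'

async def main():
    # the value returned from __await__ becomes the value of the await
    return await Waiting()

outcome = asyncio.run(main())
print(outcome)  # done
```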
You can also simplify Mihail's version to this:
async def new_sleep():
    await asyncio.sleep(2)

class Waiting:
    def __await__(self):
        async def wrapper():
            await new_sleep()
            print("OK")
        # __await__ must return an iterator, not a coroutine object
        # (returning wrapper() directly raises TypeError), so delegate
        # to the wrapper coroutine's own __await__()
        return wrapper().__await__()
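A runnable check of the wrapper approach (sleep shortened; note that __await__ has to hand back the iterator from wrapper().__await__(), since CPython rejects a bare coroutine object returned from __await__; the 'OK' return value is added so there is something to inspect):

```python
import asyncio

async def new_sleep():
    await asyncio.sleep(0.01)

class Waiting:
    def __await__(self):
        async def wrapper():
            await new_sleep()
            return 'OK'
        # __await__ must return an iterator, so hand back the wrapper
        # coroutine's iterator rather than the coroutine itself
        return wrapper().__await__()

async def main():
    return await Waiting()

res = asyncio.run(main())
print(res)  # OK
```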
Use a decorator.
def chain__await__(f):
    return lambda *args, **kwargs: f(*args, **kwargs).__await__()
Then write __await__ as a native coroutine:
async def new_sleep():
    await asyncio.sleep(2)

class Waiting:
    @chain__await__
    async def __await__(self):
        return await new_sleep()
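A quick sanity check of the decorator approach (sleep shortened; the 'slept' return value is added for the demo and survives the chaining):

```python
import asyncio

def chain__await__(f):
    # call the async def, then hand back its iterator, which is what
    # the await machinery expects from __await__
    return lambda *args, **kwargs: f(*args, **kwargs).__await__()

async def new_sleep():
    await asyncio.sleep(0.01)
    return 'slept'

class Waiting:
    @chain__await__
    async def __await__(self):
        return await new_sleep()

async def main():
    return await Waiting()

answer = asyncio.run(main())
print(answer)  # slept
```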