Since Python 3.5 introduced async with, the syntax recommended in the aiohttp docs has changed. Now, to fetch a single URL, they suggest:
    import aiohttp
    import asyncio

    async def fetch(session, url):
        with aiohttp.Timeout(10):
            async with session.get(url) as response:
                return await response.text()

    if __name__ == '__main__':
        loop = asyncio.get_event_loop()
        with aiohttp.ClientSession(loop=loop) as session:
            html = loop.run_until_complete(
                fetch(session, 'http://python.org'))
            print(html)
How can I modify this to fetch a collection of URLs instead of just one?
In the old asyncio examples you would set up a list of tasks, such as:

    tasks = [
        fetch(session, 'http://cnn.com'),
        fetch(session, 'http://google.com'),
        fetch(session, 'http://twitter.com')
    ]
I tried to combine a list like this with the approach above, but failed.
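To show the shape of what I'm aiming for, here is a minimal sketch of the pattern, assuming asyncio.gather is the right way to run the tasks concurrently. The fetch below is a stand-in that just echoes its URL (so the example runs without network access or a real aiohttp.ClientSession); session is a placeholder for the same reason:

    import asyncio

    # Stand-in for the real aiohttp fetch coroutine; it only echoes the
    # URL so the gather pattern can be demonstrated without network I/O.
    async def fetch(session, url):
        await asyncio.sleep(0)  # pretend to await some I/O
        return 'html from ' + url

    async def main():
        session = None  # placeholder; a real run would use aiohttp.ClientSession
        tasks = [
            fetch(session, 'http://cnn.com'),
            fetch(session, 'http://google.com'),
            fetch(session, 'http://twitter.com')
        ]
        # gather runs the coroutines concurrently and returns their
        # results in the same order as the input list.
        return await asyncio.gather(*tasks)

    loop = asyncio.new_event_loop()
    results = loop.run_until_complete(main())
    loop.close()
    print(results[0])

Is something along these lines the intended way to combine a task list with the new async with style, or is there a different recommended approach?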