I am using Python 3.5 on Ubuntu 16.
I am trying to use aiohttp to write a simple client.
Here is the code I have. I took it from here; it's the first code sample, with the SSL check disabled:
import aiohttp
import asyncio
import async_timeout

async def fetch(session, url):
    with async_timeout.timeout(10):
        async with session.get(url) as response:
            return await response.text()

async def main(loop):
    conn = aiohttp.TCPConnector(verify_ssl=False)
    async with aiohttp.ClientSession(loop=loop, connector=conn) as session:
        html = await fetch(session, 'http://www.google.com')
        print(html)

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))
For some sites, this code works. For others, including http://python.org or http://google.com, it does not work. Instead, the code raises this error:
aiohttp.errors.ClientOSError: [Errno 101] Cannot connect to host google.com:80 ssl:False [Can not connect to google.com:80 [Network is unreachable]]
I tried a simple requests
script, something like this:
import requests
rsp = requests.get('http://google.com')
print(rsp.text)
This works; I am able to reach Google. Both curl and wget also reach Google.
Doing some research, I came across a similar problem, which I found here. I tried the solution offered there, but it still does not work.
This issue does not occur for all sites; I came across both http and https sites that worked and others that did not.
Any suggestions on why this happens and how I can fix it?
Thank you!
Notes:
Other things I tried:
- Adding my own DNS resolver, also using aiohttp (a sketch of what I mean is below, after this list).
- Using the https version of the sites, getting the same error.
- Going to a slightly different URL, for example https://www.google.com/?#q=python
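For the DNS resolver attempt, this is roughly what I did (a minimal sketch, assuming aiodns is installed and an aiohttp version whose TCPConnector accepts a resolver argument; the Google public nameserver addresses are just an example, not something my setup requires):

import asyncio

import aiohttp
from aiohttp.resolver import AsyncResolver

async def main(loop):
    # AsyncResolver is backed by aiodns; the nameservers below are only an example
    resolver = AsyncResolver(nameservers=['8.8.8.8', '8.8.4.4'])
    conn = aiohttp.TCPConnector(resolver=resolver, verify_ssl=False)
    async with aiohttp.ClientSession(loop=loop, connector=conn) as session:
        async with session.get('http://google.com') as response:
            print(await response.text())

loop = asyncio.get_event_loop()
loop.run_until_complete(main(loop))

(The idea is that AsyncResolver bypasses the default threaded getaddrinfo lookup and resolves names through aiodns instead.)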