Timeout for python requests.get entire response

Posted 2020-01-24 01:46

I'm gathering statistics on a list of websites, and I'm using requests for simplicity. Here is my code:

import requests

data = []
websites = ['http://google.com', 'http://bbc.co.uk']
for w in websites:
    r = requests.get(w, verify=False)
    data.append((r.url, len(r.content), r.elapsed.total_seconds(),
                 str([(l.status_code, l.url) for l in r.history]),
                 str(r.headers.items()), str(r.cookies.items())))

Now, I want requests.get to timeout after 10 seconds so the loop doesn't get stuck.

This question has been of interest before, but none of the answers are clean. I will be putting a bounty on this to get a nice answer.

I hear that maybe not using requests is a good idea, but then how would I get the nice things requests offers (the ones in the tuple)?
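
For reference, this is roughly the shape of loop I'd like to end up with; a minimal sketch assuming a plain per-request timeout is acceptable (as far as I know, requests' timeout bounds the connect and read phases, not the time for the entire response):

import requests

data = []
websites = ['http://google.com', 'http://bbc.co.uk']
for w in websites:
    try:
        # timeout=10 applies per connect/read operation,
        # not to the total time for the whole response
        r = requests.get(w, timeout=10, verify=False)
    except requests.exceptions.Timeout:
        continue  # skip a slow site instead of hanging the loop
    data.append((r.url, len(r.content), r.elapsed.total_seconds()))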

19 Answers
\"骚年 ilove
2楼-- · 2020-01-24 02:49

Pardon me, but I am wondering why nobody has suggested the following simpler solution? :-o

## request
import requests
requests.get('http://www.mypage.com', timeout=20)
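
Note that when the timeout elapses, requests raises requests.exceptions.Timeout, so in the asker's loop you would typically wrap the call, something like this (a sketch; 'http://www.mypage.com' is just a placeholder URL):

import requests

try:
    r = requests.get('http://www.mypage.com', timeout=20)
except requests.exceptions.Timeout:
    r = None  # decide here how to handle a page that is too slow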