I am making several HTTP requests to a particular host using Python's urllib2 library. Each time a request is made, a new TCP and HTTP connection is created, which takes a noticeable amount of time. Is there any way to keep the TCP/HTTP connection alive using urllib2?
If you switch to httplib, you will have finer control over the underlying connection.
For example:
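The original code sample isn't preserved here, but the idea can be sketched with a small helper (the `fetch_twice` name and the HTTP/1.1 keep-alive server are assumptions for illustration; on Python 3 the httplib module is named http.client):

```python
try:  # httplib was renamed http.client in Python 3
    from http.client import HTTPConnection
except ImportError:
    from httplib import HTTPConnection  # Python 2

def fetch_twice(host, port=80):
    """Issue two GET requests over a single persistent TCP connection."""
    conn = HTTPConnection(host, port)
    conn.request("GET", "/")
    first = conn.getresponse().read()   # read the body fully before reusing
    conn.request("GET", "/")            # reuses the same TCP connection
    second = conn.getresponse().read()
    conn.close()
    return first, second
```

Note that the first response must be read completely before the connection can be reused for the second request.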
This would send 2 HTTP GETs on the same underlying TCP connection.
If you need something more automatic than plain httplib, this might help, though it's not thread-safe.
I've used the third-party urllib3 library to good effect in the past. It's designed to complement urllib2 by pooling connections for reuse. Modified example from the wiki:
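The wiki example itself isn't reproduced above, so here is a minimal sketch against urllib3's `PoolManager` API (the `get_twice` helper name and the URL are illustrative, not from the original answer):

```python
import urllib3  # third-party: pip install urllib3

def get_twice(url):
    """Fetch the same URL twice; the pool keeps the connection alive."""
    # A single PoolManager reuses connections for requests to the same host.
    http = urllib3.PoolManager()
    r1 = http.request("GET", url)
    r2 = http.request("GET", url)
    return r1.status, r2.status
```

Because both requests go through the same `PoolManager`, the second one is served over the already-open TCP connection instead of opening a new one.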