Just a short, simple one about the excellent Requests module for Python.
I can't seem to find in the documentation what the variable 'proxies' should contain. When I sent it a dict with a standard "IP:PORT" value, it rejected it, asking for 2 values. So I guess (because this doesn't seem to be covered in the docs) that the first value is the IP and the second the port?
The docs mention this only:
proxies – (optional) Dictionary mapping protocol to the URL of the proxy.
So I tried this... what should I be doing?
proxy = { ip: port}
and should I convert these to some type before putting them in the dict?
r = requests.get(url,headers=headers,proxies=proxy)
Here is my basic class in Python for the requests module, with some proxy configs and a stopwatch!
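Roughly along these lines (a minimal sketch: the class name, the placeholder proxy addresses, and the timing approach are illustrative assumptions, not a specific implementation):

    import time
    import requests

    class ProxyClient:
        """Small wrapper around requests with a proxies dict and a stopwatch."""

        def __init__(self, proxies=None, timeout=10):
            # Placeholder proxy URLs; substitute your own.
            self.proxies = proxies or {
                "http": "http://10.10.1.10:3128",
                "https": "http://10.10.1.10:1080",
            }
            self.timeout = timeout
            self.last_elapsed = None  # seconds taken by the most recent request

        def get(self, url, **kwargs):
            start = time.monotonic()
            try:
                return requests.get(url, proxies=self.proxies,
                                    timeout=self.timeout, **kwargs)
            finally:
                self.last_elapsed = time.monotonic() - start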
I have found that urllib has some really good code to pick up the system's proxy settings, and it returns them in exactly the form requests can use directly. You can use it like this:
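For example (a sketch; the URL is just a placeholder, and on Python 2 the same function lives at urllib.getproxies()):

    import urllib.request
    import requests

    # getproxies() reads the system proxy settings (environment variables on
    # Linux, the registry/system configuration on Windows and macOS) and
    # returns a dict in the form that requests expects.
    proxies = urllib.request.getproxies()
    r = requests.get("http://example.org", proxies=proxies)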
It works really well, and urllib knows how to pick up the Mac OS X and Windows settings as well.
The accepted answer was a good start for me, but I kept getting the following error:
The fix was to specify http:// in the proxy URL, thus:
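Something like this (address and port are placeholders):

    import requests

    proxies = {
        "http": "http://10.10.1.10:3128",   # explicit http:// scheme, not just "10.10.1.10:3128"
        "https": "http://10.10.1.10:1080",
    }
    r = requests.get("http://example.org", proxies=proxies)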
I'd be interested as to why the original works for some people but not me.
Edit: I see the main answer is now updated to reflect this :)
You can refer to the proxy documentation here.
If you need to use a proxy, you can configure individual requests with the proxies argument to any request method:
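For example (the proxy addresses below are placeholders in the style of the docs):

    import requests

    proxies = {
        "http": "http://10.10.1.10:3128",
        "https": "http://10.10.1.10:1080",
    }

    requests.get("http://example.org", proxies=proxies)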
To use HTTP Basic Auth with your proxy, use the http://user:password@host.com/ syntax:
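For instance (user, password, and host are placeholders):

    import requests

    proxies = {"http": "http://user:password@10.10.1.10:3128/"}
    requests.get("http://example.org", proxies=proxies)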
The 'proxies' dict syntax is {"protocol": "ip:port", ...}. With it you can specify different (or the same) proxies for requests using the http, https, and ftp protocols, as deduced from the requests documentation. On Linux you can also do this via the HTTP_PROXY, HTTPS_PROXY, and FTP_PROXY environment variables, and on Windows via set; a combined sketch follows this answer.
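A sketch of both approaches (all addresses and ports are placeholders):

    import requests

    # Per-request: one proxy URL per protocol.
    proxies = {
        "http":  "http://10.10.1.10:3128",
        "https": "http://10.10.1.11:1080",
        "ftp":   "ftp://10.10.1.10:3128",
    }
    r = requests.get("http://example.org", proxies=proxies)

    # On Linux, the equivalent via environment variables (shell commands):
    #   export HTTP_PROXY=http://10.10.1.10:3128
    #   export HTTPS_PROXY=http://10.10.1.11:1080
    #   export FTP_PROXY=ftp://10.10.1.10:3128
    # On Windows, use `set` instead of `export`.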
Thanks, Jay, for pointing this out: the syntax changed with requests 2.0.0, and you'll need to add a scheme to the URL: http://docs.python-requests.org/en/latest/user/advanced/#proxies
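In other words (placeholder address), the proxy value now has to carry a scheme:

    proxies = {"http": "http://10.10.1.10:3128"}  # the bare "10.10.1.10:3128" form needs the scheme as of 2.0.0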
It’s a bit late, but here is a wrapper class that simplifies scraping proxies and then making an HTTP POST or GET:
ProxyRequests