The code:
for item in pxfile.readlines():
    if is_OK(item):
        sys.stdout.write(item + "is not OK.")
        item = make(item)
        item = "#" + item
        resfile.write(item)
    else:
        sys.stdout.write(item)
        sys.stdout.write("is OK.")
        line = make(item)
        resfile.write(item)
If is_OK returns true it means that the proxy doesn't exist; I should fix that.
def is_OK(ip):
    try:
        proxy_handler = urllib2.ProxyHandler({'http': ip})
        opener = urllib2.build_opener(proxy_handler)
        opener.addheaders = [('User-agent', 'Mozilla/5.0')]
        urllib2.install_opener(opener)
        req = urllib2.Request('http://www.icanhazip.com')
        sock = urllib2.urlopen(req)
    except urllib2.HTTPError, e:
        #print 'Error code: ', e.code
        return e.code
    except Exception, detail:
        #print "ERROR:", detail
        return 1
    return 0
It takes 10 minutes to get a list like this:
141.219.252.132:68664
is OK.118.174.0.155:8080
is OK.91.194.246.169:8080
is not OK.91.194.246.81:8080
is OK.201.245.110.138:8888
is OK.202.43.178.31:3128
is OK.202.109.80.106:8080
- Is there a way to make it faster?
- It's formatted badly; I tried removing the newline with strip(), but had no luck.
Any ideas?
You should use threads to make the code run quicker:
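Here is a minimal sketch of that idea using the standard threading and Queue modules. It assumes the pxfile, resfile, is_OK and make objects from your question, that make() returns a string without a trailing newline, and the pool size of 20 workers is an arbitrary choice:

import threading
import Queue

proxy_queue = Queue.Queue()
write_lock = threading.Lock()

def worker():
    while True:
        try:
            item = proxy_queue.get_nowait()
        except Queue.Empty:
            return                      # queue drained, this thread is done
        status = is_OK(item)            # helper from the question; note it calls
                                        # urllib2.install_opener, which sets global
                                        # state, so threads can overwrite each
                                        # other's proxy -- calling opener.open()
                                        # inside is_OK instead avoids that
        with write_lock:                # keep file writes from interleaving
            prefix = "#" if status else ""
            resfile.write(prefix + make(item) + "\n")

for item in pxfile.readlines():
    proxy_queue.put(item.strip())       # drop the newline before queueing

workers = [threading.Thread(target=worker) for _ in range(20)]
for t in workers:
    t.start()
for t in workers:
    t.join()

With 20 workers the slow proxies overlap instead of being checked one after another, which is where most of the 10 minutes goes.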
First idea: set a shorter timeout than the default one.
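For example (a sketch; the 5-second value is just a guess you would tune), you can either set a global socket timeout before the checks start, or, on Python 2.6 and newer, pass a timeout straight to urlopen inside is_OK:

import socket
socket.setdefaulttimeout(5)             # applies to every new socket urllib2 opens

# or, per request (Python 2.6+), inside is_OK:
sock = urllib2.urlopen(req, timeout=5)  # give up on a dead proxy after 5 seconds

Dead proxies then fail fast instead of hanging for the full default timeout.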
You may also use threading so you can test several connections simultaneously.
And for the formatting, using strip() like this should be fine:
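Something along these lines, a sketch based on the loop in your question (it assumes make() returns a string without its own newline, so one is added back when writing to resfile):

for item in pxfile.readlines():
    item = item.strip()                         # drop the trailing "\n" from the file
    if is_OK(item):
        sys.stdout.write(item + " is not OK.\n")
        resfile.write("#" + make(item) + "\n")
    else:
        sys.stdout.write(item + " is OK.\n")
        resfile.write(make(item) + "\n")

The key point is stripping the newline before you build the output string and adding one back explicitly at the end, so the status text stays on the same line as the proxy address.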