I want to test my application's handling of timeouts when it grabs data via urllib2, and I want some way to force the request to time out.
Short of finding a very very slow internet connection, what method can I use?
I seem to remember an interesting application/suite for simulating these sorts of things. Maybe someone knows the link?
If you want to set the timeout for each request, you can use the timeout argument of urlopen.
You could set the default timeout with socket.setdefaulttimeout() as shown above, but since Python 2.6 there is also a timeout option in the urlopen method, so you can mix both approaches.
The default timeout for urllib2 is infinite, and importing socket lets you catch the timeout as a socket.timeout exception.
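For example, a runnable sketch (written against Python 3's urllib.request, which replaced urllib2; the call shape is the same from 2.6 on, and the throwaway listening socket is just a stand-in for a stalled server):

```python
import socket
import urllib.request  # in Python 2.6+ the same call is urllib2.urlopen

# Stand-in for a stalled server: a listening socket that never answers.
# The kernel completes the TCP handshake, but no response ever arrives.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
listener.listen(1)
url = "http://127.0.0.1:%d/" % listener.getsockname()[1]

try:
    urllib.request.urlopen(url, timeout=1)
    timed_out = False
except socket.timeout:  # the read exceeded the 1-second timeout
    timed_out = True

print(timed_out)
```

With urllib2 the except clause is spelled identically, which is why you need the socket import.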
I usually use netcat to listen on port 80 of my local machine:
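The exact flags depend on which netcat build you have (an assumption about your nc; also, ports below 1024 need root, so any free high port works just as well):

```shell
# BSD/OpenBSD netcat: listen on port 80 and never reply
nc -l 80

# traditional/GNU netcat spells the same thing with -p:
# nc -l -p 80
```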
Then I use http://localhost/ as the request URL in my application. Netcat will answer on the HTTP port but won't ever give a response, so the request is guaranteed to time out, provided that you have specified a timeout in your urllib2.urlopen() call or by calling socket.setdefaulttimeout().
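A sketch of the setdefaulttimeout() variant (Python 3 spelling of the urllib2 API; the in-process listening socket below plays the role of netcat, completing the connection but never answering):

```python
import socket
import urllib.request  # urllib2 in Python 2

socket.setdefaulttimeout(1)  # every new socket, urllib2's included, now times out

# Play the netcat role in-process: listen, but never send a response.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
url = "http://127.0.0.1:%d/" % listener.getsockname()[1]

try:
    urllib.request.urlopen(url)  # no per-call timeout needed now
    timed_out = False
except socket.timeout:
    timed_out = True

print(timed_out)
```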
Why not write a very simple CGI script in bash that just sleeps for the required timeout period?
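A hypothetical sketch (the slow.cgi name and the 30-second stall are placeholders; drop the script into your server's CGI directory):

```shell
# Write out a CGI script that stalls well past any sane client timeout.
cat > slow.cgi <<'EOF'
#!/bin/sh
sleep 30
printf 'Content-Type: text/plain\r\n\r\n'
printf 'finally awake\n'
EOF
chmod +x slow.cgi
```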
If you're running on a Mac, speedlimit is very cool.
There's also dummynet. It's a lot more hardcore, but it also lets you do some vastly more interesting things. Here's a pre-configured VM image.
If you're running on a Linux box already, there's netem.
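netem hangs artificial delay off a network interface via tc; a typical invocation might look like this (the interface name and the delay are assumptions, and it needs root):

```shell
# add five seconds of latency to everything leaving eth0
tc qdisc add dev eth0 root netem delay 5000ms

# ...exercise your timeout handling...

# put the interface back to normal
tc qdisc del dev eth0 root
```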
I believe I've heard of a Windows-based tool called TrafficShaper, but that one I haven't verified.