What's a good way to survive abnormally high traffic spikes?
My thought is that, at some trigger, my website should temporarily switch into a "low bandwidth" mode: serve basic HTML pages, use minimal graphics, disable widgets that might put unnecessary load on the database, and so on.
My thoughts on what to use as the trigger are:
- Monitor CPU usage
- Monitor bandwidth
- Monitor requests / minute
Edit: I am familiar with options like caching, switching to static content or a content delivery network, and so on as means of survival, so perhaps the question should focus more on how one detects when the website is about to become overloaded. (Although answers on other survival methods are of course still more than welcome.) Let's say that the website is running Apache on Linux with PHP. This is probably the most common configuration and should allow the maximum number of people to benefit from the answers. Let's also assume that expensive options like buying another server and load balancing are unavailable; for most of us, at least, a mention on Slashdot is going to be a once-in-a-lifetime occurrence, and not something we can spend money preparing for.
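Since the focus is on detection: on Linux, PHP exposes the system load average via `sys_getloadavg()`, so a cheap check at the top of a shared include can flip the site into low-bandwidth mode. A minimal sketch, assuming a hypothetical `lite.html` fallback page and an arbitrary threshold you would tune to your own hardware:

```php
<?php
// bootstrap.php -- included at the top of every dynamic page.
// LOAD_THRESHOLD is an assumption; tune it to your core count and
// to what "about to become overloaded" means on your hardware.
define('LOAD_THRESHOLD', 10.0);

$load = sys_getloadavg();   // [1-min, 5-min, 15-min] load averages
if ($load[0] > LOAD_THRESHOLD) {
    // Hypothetical low-bandwidth fallback: a pre-rendered static
    // page with no database queries and minimal graphics.
    readfile(__DIR__ . '/lite.html');
    exit;
}
// ...normal (expensive) page rendering continues below...
```

The same check could also watch requests per minute by incrementing a counter in shared memory or memcached, but the load average is the cheapest signal that already combines CPU and I/O pressure.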
Here's a rather lengthy but highly informative article about surviving "flash crowds".
The article describes the scenario its solutions address, a garage innovator whose site is suddenly hit by a flash crowd, and then proposes a number of steps the innovator can take, such as using storage delivery networks and implementing highly scalable databases.
nearlyfreespeech.net is a semi-cloud host, so to speak, and helps a ton in situations like this. As others mentioned above, layered caching helps a lot: pull chunks of information from memcached instead of the database, and put a reverse proxy (or a distributed reverse proxy, i.e. a CDN; Panther Networks is cheap) in front of your server.
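As a sketch of that memcached layer, using the pecl `Memcached` extension; the cache key, the 60-second TTL, the query, and the database credentials are all placeholders:

```php
<?php
// Serve an expensive query from memcached, touching the database
// only when the cached copy has expired.
$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);

$key  = 'frontpage_articles';          // hypothetical cache key
$rows = $cache->get($key);

if ($rows === false) {                 // cache miss: hit the DB once
    $pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
    $stmt = $pdo->query('SELECT id, title FROM articles ORDER BY id DESC LIMIT 20');
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    $cache->set($key, $rows, 60);      // refresh at most once a minute
}
```

Under a traffic spike this caps the database at one expensive query per minute regardless of how many requests arrive.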
Auto-redirect to Coral CDN via .htaccess, unless the request comes from Coral CDN itself.
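A minimal sketch of what those rules might look like, assuming the classic Coral pattern of appending `.nyud.net` to the hostname; `example.com` is a placeholder, and the exemption relies on Coral's proxies identifying themselves with a `CoralWebPrx` user agent:

```apache
RewriteEngine On
# Skip the redirect when Coral's own proxies fetch the content,
# otherwise the redirect would loop.
RewriteCond %{HTTP_USER_AGENT} !^CoralWebPrx
RewriteRule ^(.*)$ http://example.com.nyud.net/$1 [R,L]
```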
I rewrite all requests referred by several popular sites so that they are redirected through the Coral CDN.
An example for Apache:
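A sketch of such referrer-based rules, using `slashdot.org` and `digg.com` as example sources and `example.com` as a placeholder again:

```apache
RewriteEngine On
# Only bounce visitors arriving from these high-traffic referrers;
# everyone else gets the site directly.
RewriteCond %{HTTP_REFERER} ^http://([^/]+\.)?slashdot\.org [NC,OR]
RewriteCond %{HTTP_REFERER} ^http://([^/]+\.)?digg\.com [NC]
RewriteRule ^(.*)$ http://example.com.nyud.net/$1 [R,L]
```

The nice property of this variant is that regular visitors never see the CDN; only the flash crowd arriving from the linking sites gets offloaded.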
Don't write content or provide a service that may appeal to geeks ;)