I am currently a placement student (web developer) working for a university, and I have been assigned a few big web projects. The first is a total revamp of the university IT help site, which draws around 14k hits a month from on campus and around 4k externally. The second is a mobile version of the first. The two projects will share some resources.
To generalise this question so the answers could be useful to more people:
- I have two websites that will share some resources, let's say index.php, functions.js and style.css, and these scripts will be used on almost all pages of the websites.
- I have two audiences to cater for (in terms of download speed): users within the same network that the sites are hosted on (approx. 100 Mb/s) and external users.
I would like to know the best way to cache each kind of script (.js, .css, .php), ideally with examples of how it would be done and its pros and cons compared to other methods. By caching I mean local (browser), network and server caching.
Note: index.php is a dynamic page that should be refreshed from cache every 2 hours. It would be great if you could start your answer with .js, .css, .php or a combination, so I can easily see which type of script you are talking about caching.
Thanks All!
Well, caching is such a broad spectrum that you really should be a bit more specific.
For example, if you're looking to lower the load on the server, you would want to cache the PHP files using an opcode cache such as APC (which lowers disk reads of the files), or use memcache/redis/some other in-memory key-value store to relieve stress on your database server (application-level caching).
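As a rough sketch of the application-level idea (assuming the APCu extension, the modern successor to APC's user cache, is installed; the helper name, key and TTL are just illustrative):

    <?php
    // Hypothetical helper: keep the result of an expensive computation in
    // APCu (shared memory), so repeated requests skip the expensive work.
    function cached(string $key, int $ttl, callable $compute)
    {
        $value = apcu_fetch($key, $hit);
        if (!$hit) {
            $value = $compute();            // cache miss: do the real work
            apcu_store($key, $value, $ttl); // keep it for $ttl seconds
        }
        return $value;
    }

    // Example usage: rebuild an expensive sidebar at most once every 10 minutes.
    $sidebar = cached('help_site_sidebar', 600, function () {
        return build_sidebar_html();        // hypothetical expensive function
    });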
If we're talking about the static files, there are a number of things you could do to gain network speed:
1. Make sure the caching headers returned by the server are correct and that those files are cached by the client (for as long as you need/want). Clients get a more responsive site and you get less server load, but you'll still get hits for which you'll return a 304 Not Modified (see the sketch after this list).
2. If you're using Apache+mod_php, every Apache worker carries the PHP interpreter even for requests for static content (CSS, JS). If you place nginx in front of it, those requests can be served and cached by the HTTP server itself, which is much faster. Alternatively, go to step 3 (below).
3. You could put Varnish in front of both websites for static and semi-static content.
4. Another common "micro"-optimization, which usually affects bigger sites (nothing I would worry about at your ~20k hits a month), is to move the static files to a different domain such as some-university-static.com (not a subdomain). That way cookie headers aren't sent with requests for static files, resulting in less incoming bandwidth and a faster response for the user (the smaller the request, the faster it reaches the destination and the faster it returns).
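For point 1, here is a rough sketch in PHP of what sending correct cache headers and answering with 304 Not Modified could look like, assuming you serve an asset through a small PHP script (the file name and lifetimes are only examples; for plain static files you would normally set the same headers in the web server config instead):

    <?php
    // Hypothetical asset handler: send long-lived cache headers and answer
    // conditional requests with 304 Not Modified instead of resending the body.
    $file         = __DIR__ . '/style.css';   // example static asset
    $lastModified = filemtime($file);
    $maxAge       = 86400;                    // let clients cache it for one day

    header('Content-Type: text/css');
    header('Cache-Control: public, max-age=' . $maxAge);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $maxAge) . ' GMT');
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $lastModified) . ' GMT');

    if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
        strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $lastModified) {
        http_response_code(304);              // client copy is still fresh
        exit;
    }

    readfile($file);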
Hope this gave you some initial pointers to look into.
Ken.
Performance tuning through caching can be categorized into multiple layers.
A good introduction and practical code examples can be found in Chapter 9 (Performance) of Developing Large Web Applications. It covers caching of CSS, JavaScript, modules, pages and Ajax, as well as Expires headers.
If you need to keep things simple on the server side, do the following:
In the future, if you have multiple servers and performance becomes crucial, you will need to look at:
Finally, as references on web application performance, check:
For the .js and .css files, you can simply use Expires HTTP headers, which will cause browsers to cache them.

As for the .php, there are several options.
You can use memcache for specific things: for example, if you're loading the same list of users over and over again from the database, you can cache the result with a specific expiry time, e.g. 2 hours.
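A minimal sketch of that idea, assuming the PHP memcached extension and a memcached server on localhost (the key name, DSN and query are just placeholders):

    <?php
    // Hypothetical example: cache a user list in memcached for 2 hours, so the
    // database is only queried when the cached copy has expired.
    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);

    $users = $cache->get('user_list');
    if ($users === false) {                     // miss or expired: rebuild it
        $pdo   = new PDO('mysql:host=localhost;dbname=help_site', 'user', 'pass');
        $users = $pdo->query('SELECT id, name FROM users')->fetchAll(PDO::FETCH_ASSOC);
        $cache->set('user_list', $users, 7200); // expire after 2 hours
    }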
Or you can use a reverse proxy such as Varnish to cache away a whole static HTML page generated from the .php script.
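If you go that route, one way (a sketch, assuming Varnish's default behaviour of honouring Cache-Control from the backend, and a hypothetical render function) to get the 2-hour refresh mentioned in the question is to have index.php declare its own cache lifetime:

    <?php
    // index.php: tell an upstream cache such as Varnish that it may serve this
    // page for 2 hours (s-maxage), while browsers revalidate after 5 minutes.
    header('Cache-Control: public, s-maxage=7200, max-age=300');

    // ... render the dynamic page as usual ...
    echo render_help_page();   // hypothetical rendering function

Note that out of the box Varnish won't cache requests that carry cookies, so you may need to strip or ignore cookies on pages you want cached.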