We're being told that fewer HTTP requests per page load is a Good Thing. The extreme form of that for CSS would be to have a single, unique CSS file per page, with any shared site-wide styles duplicated in each file.
But there's a trade-off there. If you have separate shared global CSS files, they can be cached once when the front page is loaded and then reused on multiple pages, thereby reducing the necessary size of the page-specific CSS files.
So which is better in real-world practice: smaller page-specific CSS files served as multiple discrete, cacheable stylesheets, or fewer HTTP requests through fewer-but-larger CSS files?
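For concreteness, the two setups being compared look roughly like this (the file names are just placeholders):

```html
<!-- Approach A: a shared global stylesheet, cached once and reused on
     every page, plus a small page-specific stylesheet -->
<link rel="stylesheet" href="global.css">
<link rel="stylesheet" href="products.css">

<!-- Approach B: a single, larger stylesheet per page that duplicates
     the shared site-wide rules -->
<link rel="stylesheet" href="products-standalone.css">
```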
Browsers will cache CSS files whether they're big or small, so I prefer making bigger CSS files with fewer requests.
But this is not a rule; I just try to do that where I can.
I would go for combining all CSS into one single CSS file. Even if it contains some redundant styles that won't apply to all pages, after compressing it with Gzip the size should be small enough, and once the browser has cached it the size no longer matters. So put all the CSS into one file. There is one problem, though: the styles change for different pages, so you have to take another route. You could, for example, do something like the following (the page class names here are only illustrative):
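```html
<!-- home.html: give each page's <body> a class identifying the page -->
<body class="page-home">
    ...
</body>

<!-- about.html -->
<body class="page-about">
    ...
</body>
```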
Then keep the shared styles, such as the navigation classes, common to all pages:
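```css
/* Shared navigation rules, identical on every page
   (values are placeholders) */
.navigation {
    float: left;
    width: 200px;
}
```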
and put the per-page differences in separate rules scoped by the body class:
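```css
/* Page-specific variations, selected via the body class
   (colours are placeholders) */
.page-home .navigation {
    background: #336699;
}

.page-about .navigation {
    background: #993333;
}
```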
Theoretically, I think it depends on the type of site. If people are going to hit each page of the site only once, and there are a lot of pages, then the broken-up CSS files end up working out better on a graph. On a site or web app where everything gets hit constantly, or often enough anyway, the specialized/compiled CSS for each page takes a bigger hit at first (on a graph, anyway), but then it gets cached and you win in the long run because of fewer HTTP requests. And there's some overlap somewhere in between, in the nether regions of mass CSS hysteria.
Now get away from the graphs and look at your average site: it probably doesn't matter unless you're taking on some really serious traffic. But overall I'll vote for the latter here.
Your first port of call should be YSlow or Google Page Speed to figure out what is slowing your site down the most. Sometimes one or two badly compressed (large) images can slow the entire thing down. You are told to reduce HTTP requests because each request has a setup cost associated with it, but taking that to the extreme can lead to worse performance. In your case, having a separate CSS file for each page is bad form, as it makes it harder for browsers to cache your styles.
Taking any one method to the extreme is bad practice; approach the problem from a wide angle instead, combining techniques such as merging your CSS into fewer files, Gzip compression, and better image compression.
If in doubt, look at the page source of the Google home page. They optimise that page heavily, and it will give you good clues about what to do.