Should I aim for fewer HTTP requests or more cacheable CSS?

Published 2019-09-16 11:07

We're being told that fewer HTTP requests per page load is a Good Thing. The extreme form of that for CSS would be to have a single, unique CSS file per page, with any shared site-wide styles duplicated in each file.

But there's a trade-off there. If you have separate shared global CSS files, they can be cached once when the front page is loaded and then re-used on multiple pages, thereby reducing the necessary size of the page-specific CSS files.

So which is better in real-world practice? Shorter CSS files through multiple discrete CSS files that are cacheable, or fewer HTTP requests through fewer-but-larger CSS files?
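To make the two options concrete, here is a minimal sketch (the file names are hypothetical, just for illustration). Option one links a single combined stylesheet per page, with shared rules duplicated in each; option two links a shared site-wide stylesheet plus a small page-specific one:

<!-- Option 1: one combined file per page, shared styles duplicated inside it -->
<link rel="stylesheet" href="/css/index.css">

<!-- Option 2: a shared file (cached once, reused everywhere) plus a page-specific file -->
<link rel="stylesheet" href="/css/global.css">
<link rel="stylesheet" href="/css/index-page.css">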

4 Answers
三岁会撩人
#2 · 2019-09-16 11:40

Browsers will cache CSS files whether they are big or small, so I prefer making bigger CSS files with fewer requests.

That's not a hard rule, though; it's just what I try to do where I can.

对你真心纯属浪费
#3 · 2019-09-16 11:45

I would go for combining all CSS into one single file. Even if it contains some redundant styles that don't apply to every page, the file should be small enough after Gzip compression, and once the browser has cached it the size no longer matters. There is one complication, though: the styles differ between pages, so you need a way to scope them. You could, for example, do something like:

index.html
<div class="navigation_index"></div>

about.html
<div class="navigation_about"></div>

Then share the common styles across both navigation classes:

.navigation_about, .navigation_index {
    color: #000;
}

and specify the page-specific differences in separate rules:

.navigation_about {
    font-family: sans-serif;
}
.navigation_index {
    font-family: serif;
}
一纸荒年 Trace。
#4 · 2019-09-16 11:58

Well, theoretically it depends on the type of site. If people hit each page of the site once and only once, and there are a lot of pages, then the broken-up CSS files end up working out better on a graph: the shared file is cached on the first page and reused on every page after that. Versus a site or web app where the same pages get hit constantly, or often enough anyway: there, a specialized/compiled CSS file per page takes a bigger hit at first (on a graph, anyway), but then gets cached, and you win in the long run because of fewer HTTP requests. And then there's some overlap somewhere in between.

Now step away from the graphs and look at your average site: it probably doesn't matter unless you're taking on some really serious traffic. But overall I'll vote for the latter here.
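To put rough, purely hypothetical numbers on that trade-off: say the shared styles weigh 40 KB and each page adds 5 KB of its own. A visitor who views five different pages with split files downloads the 40 KB shared file once (cached thereafter) plus 5 × 5 KB = 65 KB in total, across six requests. With a separate combined file of roughly 45 KB per page, that same visitor downloads 5 × 45 KB = 225 KB across five requests; but a visitor who keeps returning to the same page pays a single request, ever. Which column wins depends entirely on how your traffic actually behaves.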

做自己的国王
#5 · 2019-09-16 12:00

Your first port of call should be YSlow or Google Page Speed, to figure out what is actually slowest on your site. Sometimes one or two badly compressed (large) images are slowing the whole thing down. You are told to reduce HTTP requests because each request has a setup cost associated with it, but taken to the extreme that advice leads to worse performance. In your case, having a CSS file for each page is bad form, as it makes the styles harder for browsers to cache.

Taking any one method to the extreme is bad practice; you should approach this problem from several angles at once:

  • Properly compress images, or use CSS sprites (reduces HTTP requests)
  • Implement proper web caching using Expires, ETag, etc., so clients don't have to re-request everything (see the header sketch after this list)
  • Minify your CSS and JavaScript files using YUI Compressor or a similar tool
  • Improve your CSS / JavaScript code for performance; certain CSS selectors can make the browser take longer to render a page
  • Replace images with pure CSS where possible, e.g. background colours
  • Use Gzip compression on any text output, e.g. HTML, CSS, JS
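As a concrete illustration of the caching and compression points (the header values here are examples I've picked, not recommendations from the answer), the response for a stylesheet served this way might look like:

HTTP/1.1 200 OK
Content-Type: text/css
Content-Encoding: gzip
Cache-Control: max-age=31536000
Expires: Wed, 16 Sep 2020 11:07:00 GMT
ETag: "5d7f3a2b"

The Expires and Cache-Control headers let the browser reuse its cached copy without asking again; the ETag lets it revalidate cheaply with a conditional request (a 304 Not Modified instead of the full file) once the copy goes stale.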

If in doubt, look at the source of the Google home page. They optimise that page heavily, and it will give you good clues about what to do.
