My problem is that the browser sometimes serves stale cached copies of resources even after I have modified them on the server. After an F5 refresh, everything is fine.
I spent a whole afternoon studying this, and I now understand the point of "Last-Modified" and "Cache-Control". I also know how to work around my issue (append a version query string like .js?version, or set an explicit max-age=xxxx). But the question remains unanswered: how does the browser handle a response without a "Cache-Control" header, like this one:
Content-Length: 49675
Content-Type: text/html
Last-Modified: Thu, 27 Dec 2012 03:03:50 GMT
Accept-Ranges: bytes
Etag: "0af7fcbdee3cd1:972"
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
Date: Thu, 24 Jan 2013 07:46:16 GMT
The browser clearly serves these resources from cache when I navigate by pressing Enter in the address bar.
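For reference, a quick way to see which caching headers a server actually sends back (hypothetical URL, Python used purely for illustration):

# Print the caching-related response headers; in the case above you would
# see Last-Modified and Etag, but no Cache-Control or Expires.
import urllib.request

with urllib.request.urlopen("http://example.com/page.html") as resp:
    for name in ("Cache-Control", "Expires", "Last-Modified", "ETag", "Date"):
        print(name, ":", resp.headers.get(name))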
Without a Cache-Control header, the browser requests the resource every time it loads a new(?) page. Hitting F5 invalidates (or even logically removes) every cached item on that page, forcing a complete reload as if no local copy were available - I am unsure whether the browser actually removes those resources from the cache before requesting them again.
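As a sketch (Python, for illustration only; the URL is hypothetical, the validators are taken from the response shown in the question), the conditional request a browser makes when it revalidates instead of blindly reusing its copy looks like this:

import urllib.error
import urllib.request

# Revalidate using the validators from the earlier response.
req = urllib.request.Request(
    "http://example.com/app.js",
    headers={
        "If-Modified-Since": "Thu, 27 Dec 2012 03:03:50 GMT",
        "If-None-Match": '"0af7fcbdee3cd1:972"',
    },
)
try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status, "- resource changed, new copy downloaded")
except urllib.error.HTTPError as err:
    if err.code == 304:
        print("304 Not Modified - the cached copy is still valid")
    else:
        raise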
The funny part is that some browsers apply 'additional' optimizations, such as requesting a resource only once per page load. If you have an image that changes with every request, like a hit counter, you will see only one version of it even if the page uses it multiple times.
Another one is that the browser reuses images that are not explicitly marked as no-cache by applying some sort of local 'preferred' caching. If you want a request every time, you need to mark the resource as must-revalidate and set Expires to -1 or something like that (see the header example further down).
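A minimal sketch of what that looks like on the server side (not the IIS/ASP.NET setup from the question, just a toy Python server for experimenting):

from http.server import BaseHTTPRequestHandler, HTTPServer

class NoCacheHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"<html><body>hello</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        # Ask clients to revalidate on every use of the cached copy.
        self.send_header("Cache-Control", "no-cache, must-revalidate")
        # An invalid date such as -1 counts as "already expired".
        self.send_header("Expires", "-1")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), NoCacheHandler).serve_forever()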
So, depending on the resource, specifying nothing often triggers defaults that are not what you would expect from reading the specs.
There may also be different behaviour depending on whether the source appears to be localhost, a local drive, or a real remote internet server. Sadly, browsers do not all behave the same here, and I can only test a limited set of them.
What helps is to check out www.google.com and look at the tracking pixels their page requests (two 1x1 pixels requested from metrics.gstats.com, with a random part in the subdomain).
If you use Firebug to inspect the response headers, you will see that they specify the no-cache directives in every fashion possible. The header reads roughly like this:
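(Illustrative values only, not Google's literal response, but the belt-and-braces combination meant here:)

Cache-Control: no-cache, no-store, must-revalidate
Pragma: no-cache
Expires: -1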
Try this as a setting and check whether it solves the issue of the browser not picking up your changed resources. The must-revalidate directive forces even proxy caches to revalidate the resource instead of serving it stale, checking for a 304 Not Modified reply.
I am currently experiencing something similar. I have a localhost server that sets an ETag, and all that happens is that the cache never asks again. I did not set any other caching information. Specifying an ETag alone seems to cause Firefox not to request the resource again, so I see the same behaviour as in your problem.
The default Cache-Control header value is: Private
Please see http://msdn.microsoft.com/en-us/library/ms524721%28v=vs.90%29.aspx
The freshness lifetime is calculated based on several headers. If a "Cache-Control: max-age=N" header is specified, then the freshness lifetime is equal to N. If this header is not present, which is very often the case, the cache checks whether an Expires header is present. If an Expires header exists, its value minus the value of the Date header determines the freshness lifetime. Finally, if neither header is present, the cache looks for a Last-Modified header; if it is present, the freshness lifetime is equal to (Date - Last-Modified) / 10.
https://developer.mozilla.org/zh-CN/docs/Web/HTTP/Caching_FAQ
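Applied to the headers from the question, a rough sketch of that heuristic in Python:

from email.utils import parsedate_to_datetime

date = parsedate_to_datetime("Thu, 24 Jan 2013 07:46:16 GMT")
last_modified = parsedate_to_datetime("Thu, 27 Dec 2012 03:03:50 GMT")

# Heuristic freshness lifetime = (Date - Last-Modified) / 10
freshness = (date - last_modified) / 10
print(freshness)  # 2 days, 19:40:14.600000 -> fresh for roughly 2.8 days

That matches the behaviour in the question: for almost three days after a change, the browser may serve its cached copy without asking the server at all, until an F5 forces revalidation.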
RFC 7234 details what browsers and proxies should do by default:
Caching is usually enabled by default in browsers, so Cache-Control can be used either to customise this behaviour or to disable it. The time the browser considers a cached response fresh is usually relative to when it was last modified, typically a fraction of the time since Last-Modified, as in the heuristic quoted above.
This post has details of how the different browsers calculate that value.
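For example (illustrative values), Cache-Control: max-age=3600 pins the freshness lifetime to one hour regardless of Last-Modified, while Cache-Control: no-store disables caching entirely.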