I don't get the practical difference between Cache-Control: no-store and Cache-Control: no-cache.

As far as I know, no-store means that no cache device is allowed to store that response at all. On the other hand, no-cache means that no cache device is allowed to serve a cached response without first validating it with the origin server. But what is that validation about? A conditional GET? What if a response has no-cache but no Last-Modified or ETag?
Regards.
As you identified, no-cache doesn't mean there is never caching, but rather that the user agent has to always ask the server if it's OK to use what it cached. By contrast, no-store says to not even keep a copy, which means there's nothing to ask about. If you know the answer to "Can I reuse this?" is always no, you get a performance boost by skipping cache validation and saving room in the cache for other data.
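A rough way to picture that difference is the decision a cache has to make, first when a response arrives and then when it is asked for the same URL again. The sketch below is only illustrative; the class and method names are hypothetical, not taken from any real cache implementation:

```python
"""Toy sketch of how a cache might treat no-store vs. no-cache.
All names here are hypothetical, for illustration only."""

from dataclasses import dataclass, field


@dataclass
class Response:
    status: int
    headers: dict = field(default_factory=dict)
    body: bytes = b""


class ToyCache:
    def __init__(self):
        self._store = {}

    def on_response(self, url, response):
        cc = response.headers.get("Cache-Control", "")
        if "no-store" in cc:
            # no-store: never keep a copy, so there is nothing
            # to revalidate (or to reuse) later.
            return
        self._store[url] = response

    def on_request(self, url, fetch_from_origin, revalidate):
        cached = self._store.get(url)
        if cached is None:
            return fetch_from_origin(url)
        cc = cached.headers.get("Cache-Control", "")
        if "no-cache" in cc:
            # no-cache: a copy exists, but it may only be reused
            # after successful revalidation with the origin.
            return revalidate(url, cached)
        return cached
```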
Aside from performance, there is a behavior difference with browser history. HTTP 1.1 section 13.13 says that "expiration time does not apply to history mechanisms." The no-cache header describes expiration, and so doesn't apply to history mechanisms such as the back button. Thus, the user can navigate backward to a previous page with no-cache without the server being contacted.
The no-store header, on the other hand, prevents the data from being stored outside of a session, in which case it simply isn't available for a history mechanism to use. With no-store, if the user ends the session by navigating to another domain and then goes back, the only way for the browser to know what to display is to fetch the initial page again from the server.
A Chromium issue on this topic draws the same distinction.
Exactly: the validation checks Last-Modified or ETag. The client asks the server whether it has a newer version of the data by sending those validators back (as If-Modified-Since / If-None-Match), and if the answer is no (304 Not Modified), it serves the cached data.
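Concretely, that revalidation is a conditional GET: the client resends the validators it saved and the server replies 304 Not Modified if nothing changed. A minimal sketch using the Python requests library (the URL is a placeholder):

```python
import requests

url = "https://example.com/resource"  # placeholder URL

# First fetch: remember the validators the server sent, if any.
first = requests.get(url)
etag = first.headers.get("ETag")
last_modified = first.headers.get("Last-Modified")

# Revalidation: send the validators back as a conditional GET.
headers = {}
if etag:
    headers["If-None-Match"] = etag
if last_modified:
    headers["If-Modified-Since"] = last_modified

second = requests.get(url, headers=headers)
if second.status_code == 304:
    # Nothing changed: reuse the body from the cached copy.
    body = first.content
else:
    # The resource changed (or no validators were available),
    # so use the fresh response.
    body = second.content
```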
Update, from the RFC: no-cache means a stored response must not be used to satisfy a subsequent request without successful validation on the origin server, while no-store means a cache must not store any part of the request or the response (RFC 7234, Section 5.2.2).
The flow chart in Google's HTTP caching guide gives a good overview of how these directives fit together: https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/http-caching?hl=en#cache-control