I've set up action caching (with sweepers, though I guess that's irrelevant here) in my app, and so far it works great except for one thing:
I use Kaminari for pagination, so when I execute `expire_action` on my action it only expires the first page. Since caching won't work when the page is specified in the query string, I've set up a route so the page number is appended to the end of the URL (for example /people/123/page/2).
I'll add more info to this post if necessary, but I'm guessing there's something obvious I'm missing here, so: does anyone know how to expire the rest of my pages?
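For illustration, a route that produces URLs like /people/123/page/2 might be declared roughly like this (a sketch, not the asker's actual config; the controller and action names are assumptions):

```ruby
# config/routes.rb -- maps /people/123/page/2 to PeopleController#show,
# putting params[:page] in the path rather than the query string so
# that action caching keys each page separately.
get "people/:id/page/:page" => "people#show",
    :as => :paged_person, :constraints => { :page => /\d+/ }
```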
Here is a solution I've thought of, facing the same problem, though I haven't implemented it yet: cache the actual expiry time in its own key. The key would be a canonical representation of the search URL, i.e. without the "page" parameter. For example:

1. A user searches on http://example.com?q=foo&page=3, so `params` is `{ q: 'foo', page: 3 }`. Strip out `page=3` and we're left with `{ q: 'foo' }`. Run `to_param` on it and add some prefix, and we're left with a cache key like `search_expiry_q=foo`.
2. Look up the cache for this canonical query, i.e. `Rails.cache.read("search_expiry_q=foo")`. If it exists, we'll make our result expire at that time. Unfortunately, we only have `expires_in`, not `expires_at`, so we'll have to do a calculation, i.e. something like `expires_in: expiry_time - Time.now - 5.seconds` (the 5 seconds hopefully prevents any race conditions). We cache the full URL/params this way.
3. If, on the other hand, there's no expiry, then no-one has performed the search recently. So we store a new expiry time under the canonical key, and cache this fragment/page, again with full URL/params, and `expires_in: 1.hour`.
I'm still interested in an answer to my original question and will change my accepted answer should a solution come up. That said, I ended up caching only the first page, by checking whether a page was specified at all:
caches_action :index, :if => Proc.new { params[:page].nil? }
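In context, that workaround sits in the controller something like this (the controller and model names are guesses based on the /people/... URL above, not the asker's real code):

```ruby
class PeopleController < ApplicationController
  # Cache only the unpaginated request; /people/123/page/2 and
  # friends are rendered fresh each time, so expire_action on the
  # bare action is enough to invalidate everything that was cached.
  caches_action :index, :if => Proc.new { params[:page].nil? }

  def index
    # Kaminari's .page scope; nil falls back to the first page.
    @people = Person.page(params[:page])
  end
end
```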