handling redis maxmemory situations with rails

Posted 2019-07-30 11:57

Question:

When redis hits a 'maxmemory' condition, it will let the client do a read, but not a write.

This results in a fatal error, of course... is there any way to make Rails handle a cache read OR write error, so that if something bad happens to the cache (availability, read, write, etc.), it will continue to run as if caching were set to "off"?
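One way to get the "run as if caching were off" behaviour is to wrap the cache store so that any backend error is swallowed: a failed read becomes a cache miss and a failed write is ignored. This is a minimal plain-Ruby sketch, not Rails API; `SafeCache` and `FailingBackend` are hypothetical names.

```ruby
# Hedged sketch: degrade to "caching off" when the backing store raises,
# e.g. a Redis write rejected at maxmemory under the noeviction policy.
class SafeCache
  def initialize(backend, logger: nil)
    @backend = backend
    @logger = logger
  end

  def read(key)
    @backend.read(key)
  rescue StandardError => e
    @logger&.warn("cache read failed: #{e.message}")
    nil # behave like a cache miss
  end

  def write(key, value, options = {})
    @backend.write(key, value, options)
    true
  rescue StandardError => e
    @logger&.warn("cache write failed: #{e.message}")
    false # swallow the error; the app keeps running
  end
end

# A backend that always fails, standing in for a Redis client at maxmemory:
class FailingBackend
  def read(key)
    raise "OOM command not allowed when used memory > 'maxmemory'"
  end

  def write(key, value, options = {})
    raise "OOM command not allowed when used memory > 'maxmemory'"
  end
end
```

For what it's worth, later versions of Rails (5.2+) ship a `redis_cache_store` that accepts an `error_handler:` option serving the same purpose, so on a modern app that is the first thing to reach for.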

Answer 1:

There are different behaviours you can tell redis to follow when it has filled up its memory:

# volatile-lru -> remove the key with an expire set using an LRU algorithm
# allkeys-lru -> remove any key accordingly to the LRU algorithm
# volatile-random -> remove a random key with an expire set
# allkeys-random -> remove a random key, any key
# volatile-ttl -> remove the key with the nearest expire time (minor TTL)
# noeviction -> don't expire at all, just return an error on write operations

The default is:

# maxmemory-policy volatile-lru
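The policy is set in redis.conf alongside the memory cap (it can also be changed at runtime with `CONFIG SET maxmemory-policy ...`). A sketch of the relevant lines; the 100mb limit is an arbitrary example value:

```
# redis.conf: cap memory usage and pick an eviction policy
maxmemory 100mb
maxmemory-policy volatile-lru
```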

Maybe the best option is 'volatile-ttl', and make sure that all your cache writes include the :expires_in option.
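The reason the two go together: under volatile-ttl, redis can only evict keys that have an expiry set, so any cache entry written without :expires_in can never be reclaimed. In Rails this means always calling e.g. `Rails.cache.fetch(key, expires_in: 1.hour) { ... }`. Below is a minimal plain-Ruby sketch of that fetch-with-expiry pattern; `TtlCache` is a hypothetical stand-in for the cache store, with an injectable clock so expiry is easy to see.

```ruby
# Hypothetical stand-in for a TTL-aware cache, mimicking the shape of
# Rails.cache.fetch(key, expires_in: ...) { compute }.
class TtlCache
  def initialize(clock: -> { Time.now.to_f })
    @clock = clock
    @store = {} # key => [value, expires_at]
  end

  # Returns the cached value if still fresh; otherwise runs the block,
  # stores the result with an expiry, and returns it.
  def fetch(key, expires_in:)
    value, expires_at = @store[key]
    return value if expires_at && @clock.call < expires_at

    fresh = yield
    @store[key] = [fresh, @clock.call + expires_in]
    fresh
  end
end
```

Because every entry carries an expiry, volatile-ttl always has candidates to evict (the ones closest to expiring), instead of erroring out like noeviction would.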

I'm no expert and I have not done this; this is just based on my current understanding of redis and Rails.