Using the LRU Cache decorator found here: http://code.activestate.com/recipes/578078-py26-and-py30-backport-of-python-33s-lru-cache/
    from lru_cache import lru_cache

    class Test:
        @lru_cache(maxsize=16)
        def cached_method(self, x):
            return x + 5
I can create a decorated class method with this, but it ends up creating a global cache shared by all instances of class Test. My intent, however, was a per-instance cache: if I instantiate three Tests, I should get three LRU caches rather than one LRU cache shared by all three instances.
The only indication I have that this is happening is that calling cache_info() on the decorated method of each instance returns identical cache statistics (which is extremely unlikely, given that the instances are being called with very different arguments):
CacheInfo(hits=8379, misses=759, maxsize=128, currsize=128)
CacheInfo(hits=8379, misses=759, maxsize=128, currsize=128)
CacheInfo(hits=8379, misses=759, maxsize=128, currsize=128)
Is there a decorator or trick that would allow me to easily cause this decorator to create a cache for each class instance?
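The sharing is easy to reproduce. In this sketch, the stdlib functools.lru_cache stands in for the recipe's backport; the decorated function object, and hence its cache, is created once at class definition time, so every instance sees the same statistics:

```python
import functools

class Test:
    # functools.lru_cache stands in for the recipe's backport here.
    # The wrapper (and its cache) is built once, when the class body
    # executes, so all instances share it.
    @functools.lru_cache(maxsize=16)
    def cached_method(self, x):
        return x + 5

t1, t2 = Test(), Test()
t1.cached_method(1)
t2.cached_method(2)
# Both instances report the very same statistics: one shared cache.
print(t1.cached_method.cache_info() == t2.cached_method.cache_info())
```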
These days, methodtools will do this. You need to install methodtools (pip install methodtools); if you are still using Python 2, then functools32 is also required.
How about this: a function decorator that wraps the method with lru_cache the first time it's called on each instance?
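A minimal sketch of that idea (my own version, not the gist's code; instance_lru_cache is a made-up name, and the recipe's backport would slot in where functools.lru_cache appears):

```python
import functools

def instance_lru_cache(maxsize=128):
    """On the first call per instance, build an lru_cache-wrapped
    bound method and stash it in the instance's __dict__, where it
    shadows this wrapper for all later lookups."""
    def decorator(method):
        @functools.wraps(method)
        def wrapper(self, *args, **kwargs):
            cached = functools.lru_cache(maxsize=maxsize)(
                functools.partial(method, self))
            # The instance attribute shadows the class-level wrapper,
            # so each instance ends up with its own cache.
            setattr(self, method.__name__, cached)
            return cached(*args, **kwargs)
        return wrapper
    return decorator

class Test:
    @instance_lru_cache(maxsize=16)
    def cached_method(self, x):
        return x + 5
```

After the first call, t.cached_method is the per-instance cached function, so t.cached_method.cache_info() reports that instance's statistics only.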
Here's a gist on GitHub with some inline documentation.
Assuming you don't want to modify the recipe's code (e.g., because you want to be able to just port to 3.3 and use the stdlib functools.lru_cache, or use functools32 from PyPI instead of copying and pasting a recipe into your code), there's one obvious solution: create a new decorated instance method with each instance.