How can I use functools' lru_cache inside classes without leaking memory?
In the following minimal example, the foo instance won't be released, even though it goes out of scope and has no referrer other than the lru_cache:
from functools import lru_cache

class BigClass:
    pass

class Foo:
    def __init__(self):
        self.big = BigClass()

    @lru_cache(maxsize=16)
    def cached_method(self, x):
        return x + 5

def fun():
    foo = Foo()
    print(foo.cached_method(10))
    print(foo.cached_method(10))  # use the cache
    return 'something'

fun()
But foo (and hence foo.big, a BigClass instance) is still alive:
import gc; gc.collect() # collect garbage
len([obj for obj in gc.get_objects() if isinstance(obj, Foo)]) # is 1
That means the Foo/BigClass instances still reside in memory. Even deleting Foo (del Foo) will not release them.
Why is lru_cache holding on to the instance at all? Doesn't the cache use some hash and not the actual object?
What is the recommended way to use lru_cache inside classes?
I know of two workarounds: use per-instance caches, or make the cache ignore the object (which might lead to wrong results, though).
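Both workarounds can be sketched roughly like this (a sketch, not code from the question; class and helper names are illustrative, and BigClass is dropped for brevity):

```python
from functools import lru_cache

class PerInstanceCache:
    """Workaround 1: build the cache in __init__ so it lives on the
    instance. The instance and its cache form a reference cycle, which
    the cyclic garbage collector can break once the instance dies."""
    def __init__(self):
        self.cached_method = lru_cache(maxsize=16)(self._cached_method)

    def _cached_method(self, x):
        return x + 5

@lru_cache(maxsize=16)
def _compute(x):  # hypothetical helper; never sees the instance
    return x + 5

class IgnoringSelf:
    """Workaround 2: cache a module-level function that ignores the
    object entirely -- this gives wrong results if the method actually
    depends on instance state."""
    def cached_method(self, x):
        return _compute(x)
```

Both `PerInstanceCache().cached_method(10)` and `IgnoringSelf().cached_method(10)` return 15; only the first keeps a separate cache per object.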
This is not the cleanest solution, but it's entirely transparent to the programmer: it takes the exact same parameters as lru_cache and works exactly the same. However, it never passes self to lru_cache and instead uses a per-instance lru_cache.