I have a function that returns a list, say list_x.
def result(val):
    ...
    return list_x
I am calling result() every minute and storing the list.
def other_func():
    # called every minute
    new_list = result(val)
I would like to store the value of new_list for an hour (in some sort of in-memory cache, maybe?) and only then update it again, basically calling result() once an hour instead of every minute. I read about functools.lru_cache, but I don't think it will help here. Any ideas?
A solution using the ring library, which provides caching decorators with an expire option. If you don't need an LRU policy, its plain dict-backed decorator is enough; a sketch follows.
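A minimal sketch, assuming ring.dict accepts a backing dict and an expire argument in seconds; check the ring documentation for the exact signature, and the function body here is only a stand-in for the real computation:

import ring

# Assumed API: dict-backed cache whose entries expire after one hour (3600 s).
@ring.dict({}, expire=3600)
def result(val):
    list_x = [val]  # stand-in for the expensive work
    return list_x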
The ttl_cache decorator in cachetools==3.1.0 works a lot like functools.lru_cache, but with a time to live.
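A sketch of how that could look for the question's result(val), assuming a one-hour TTL (the maxsize value and the function body are placeholders):

import cachetools.func

# Each distinct val gets its own entry; entries expire after one hour (3600 s).
@cachetools.func.ttl_cache(maxsize=128, ttl=3600)
def result(val):
    list_x = [val]  # stand-in for the expensive work
    return list_x

other_func() can keep calling result(val) every minute; the expensive body only re-runs once the hour is up.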
Create a function that acts as a cache; we'll call it result_cacher. This function checks whether an hour has passed, updates the cache if it has, and then returns the cached result.
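A minimal sketch of that idea, assuming the hour-long interval from the question and passing val straight through (lastResultCache holds the time of the last refresh, resultCache the cached list):

import time

lastResultCache = 0   # time of the last refresh
resultCache = None    # the most recently cached list

def result_cacher(val):
    global lastResultCache, resultCache
    # Only call result() again if an hour (3600 seconds) has passed.
    if time.time() - lastResultCache >= 3600:
        lastResultCache = time.time()
        resultCache = result(val)
    return resultCache

other_func() then calls result_cacher(val) every minute instead of result(val).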
If you want to apply the caching for each individual input instead of for every call, use dictionaries (keyed by the input) for lastResultCache and resultCache.
Building a single-element cache with a time-to-live is pretty trivial:
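A minimal sketch along those lines, assuming a one-hour TTL and using the question's result(val) as the expensive call (_time and _value are illustrative module-level names):

import time

_time, _value = None, None

def cached_result(val):
    global _time, _value
    now = time.time()
    # Recompute only if nothing is cached yet or the cached value is over an hour old.
    if _time is None or now - _time > 3600:
        _value = result(val)
        _time = now
    return _value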
If you want to generalize this as a decorator, it's not much harder:
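A sketch of such a decorator (cache_for is an illustrative name; the timestamp and cached value live as attributes on the wrapper, and every call shares one cached value regardless of arguments):

import functools
import time

def cache_for(ttl):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            now = time.time()
            # Recompute when nothing is cached yet or the cached value has expired.
            if wrapper._time is None or now - wrapper._time > ttl:
                wrapper._value = func(*args, **kwargs)
                wrapper._time = now
            return wrapper._value
        wrapper._time = None
        wrapper._value = None
        return wrapper
    return decorator

@cache_for(3600)  # one hour
def result(val):
    return [val]  # stand-in for the expensive work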
If you want it to handle different arguments, storing a time-to-live for each one:
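A sketch of the per-argument variant, keying the cache on the call arguments and storing a timestamp next to each value (the arguments are assumed hashable):

import functools
import time

def cache_for(ttl):
    def decorator(func):
        cache = {}  # (args, kwargs) -> (timestamp, value)
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            now = time.time()
            key = (args, tuple(sorted(kwargs.items())))
            entry = cache.get(key)
            # Recompute for this key if it is missing or older than the TTL.
            if entry is None or now - entry[0] > ttl:
                entry = (now, func(*args, **kwargs))
                cache[key] = entry
            return entry[1]
        return wrapper
    return decorator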
You can of course keep adding features to it: give it a max size and evict by time of storage or by LRU or whatever else you want, expose cache stats as attributes on the decorated function, etc. The implementation of lru_cache in the stdlib should help show you how to do most of the trickier things (since it does almost all of them).

A decorator usually solves this nicely.
As an aside, this is built into memcache, and that may be a better solution (I'm not sure what problem domain you are working in).

You can use nested functions for this as well.