Caching in-memory with a time limit in Python

Posted 2020-06-25 04:46

I have a function that returns a list, say list_x.

def result(val):
    ...
    return list_x

I am calling result() every minute and storing the list.

def other_func():
    #called every minute
    new_list = result(val)

I would like to store the value of new_list for an hour (in some sort of in-memory cache, maybe?) and only then refresh it, i.e. call result() once an hour rather than every minute. I read about functools.lru_cache but I don't think it will help here. Any ideas?
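For context, lru_cache can in fact be coaxed into time-based expiry by passing it an extra clock-derived argument so the cache key changes once per hour. This is a sketch, not the asker's code; `result` here is a stand-in returning a dummy list:

```python
import functools
import time

def ttl_hash(seconds=3600):
    # Changes value once every `seconds`; a new value means a new cache key,
    # so the old entry is simply never looked up again
    return round(time.time() / seconds)

@functools.lru_cache(maxsize=None)
def result(val, _ttl=None):
    # stand-in for the expensive computation that builds list_x
    return [val, val * 2]

# pass the hash at each call site; entries "expire" when the hash rolls over
new_list = result(5, _ttl=ttl_hash())
```

Within the same hour the hash is constant, so repeated calls hit the cache; after the hour boundary the hash changes and the function runs again.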

5 Answers
霸刀☆藐视天下
#2 · 2020-06-25 05:36

A solution using the third-party ring package:

import ring

@ring.lru(expire=60 * 60)  # seconds
def cached_function(keys):
    return ...

If you don't need LRU policy,

@ring.dict(expire=60*60)  # seconds
def cached_function(keys):
    return ...
迷人小祖宗
#3 · 2020-06-25 05:38

The ttl_cache decorator in cachetools==3.1.0 works a lot like functools.lru_cache, but with a time to live.

import cachetools.func

@cachetools.func.ttl_cache(maxsize=128, ttl=10 * 60)
def example_function(key):
    return get_expensively_computed_value(key)


class ExampleClass:
    EXP = 2

    @classmethod
    @cachetools.func.ttl_cache()
    def example_classmethod(cls, i):
        return i * cls.EXP

    @staticmethod
    @cachetools.func.ttl_cache()
    def example_staticmethod(i):
        return i * 3
一夜七次
#4 · 2020-06-25 05:39

Create a function that acts as a cache; we'll call it result_cacher.

import time

lastResultCache = 0
resultCache = None

def result_cacher():
    global lastResultCache, resultCache
    # Checks whether 3600 sec (1 hour) have passed since the last cache update
    if time.time() - lastResultCache >= 3600:
        lastResultCache = time.time()
        resultCache = result()
    return resultCache

This function checks if an hour has passed, updates the cache if it has, and then returns the cache.

If you want to apply the caching for each individual input instead of for whenever the function is called, use dictionaries for lastResultCache and resultCache.

import time 
lastResultCache = {}
resultCache = {}
def result_cacher(val):
    #.get() gets value for key from dict, but if the key is not in the dict, it returns 0
    if time.time() - lastResultCache.get(val, 0) >= 3600: #Checks if 3600 sec (1 hour) has passed since the last cache 
        lastResultCache[val] = time.time()
        resultCache[val] = result(val)
    return resultCache.get(val)
beautiful°
#5 · 2020-06-25 05:48

Building a single-element cache with a time-to-live is pretty trivial:

import datetime

_last_result_time = None
_last_result_value = None
def result(val):
    global _last_result_time
    global _last_result_value
    now = datetime.datetime.now()
    if not _last_result_time or now - _last_result_time > datetime.timedelta(hours=1):
        _last_result_value = <expensive computation here>
        _last_result_time = now
    return _last_result_value

If you want to generalize this as a decorator, it's not much harder:

import datetime
import functools

def cache(ttl=datetime.timedelta(hours=1)):
    def wrap(func):
        time, value = None, None
        @functools.wraps(func)
        def wrapped(*args, **kw):
            nonlocal time
            nonlocal value
            now = datetime.datetime.now()
            if not time or now - time > ttl:
                value = func(*args, **kw)
                time = now
            return value
        return wrapped
    return wrap

If you want it to handle different arguments, storing a time-to-live for each one:

def cache(ttl=datetime.timedelta(hours=1)):
    def wrap(func):
        cache = {}
        @functools.wraps(func)
        def wrapped(*args, **kw):
            now = datetime.datetime.now()
            # see lru_cache for fancier alternatives
            key = tuple(args), frozenset(kw.items()) 
            if key not in cache or now - cache[key][0] > ttl:
                value = func(*args, **kw)
                cache[key] = (now, value)
            return cache[key][1]
        return wrapped
    return wrap

You can of course keep adding features to it—give it a max size and evict by time of storage or by LRU or whatever else you want, expose cache stats as attributes on the decorated function, etc. The implementation of lru_cache in the stdlib should help show you how to do most of the trickier things (since it does almost all of them).

孤傲高冷的网名
#6 · 2020-06-25 05:51

A decorator usually solves this nicely.

import functools
import time

def cache(fn=None, time_to_live=3600 * 24):  # one DAY default (or whatever)
    if not fn:
        return functools.partial(cache, time_to_live=time_to_live)
    my_cache = {}
    def _inner_fn(*args, **kwargs):
        kws = tuple(sorted(kwargs.items()))  # sorted so kwarg order doesn't change the key
        key = tuple(args) + kws
        if key not in my_cache or time.time() > my_cache[key]['expires']:
            my_cache[key] = {"value": fn(*args, **kwargs),
                             "expires": time.time() + time_to_live}
        return my_cache[key]['value']
    return _inner_fn

@cache(time_to_live=3600) # an hour
def my_sqrt(x):
    return x**0.5

@cache(time_to_live=60*30) # 30 mins
def get_new_emails():
    return my_stmp.get_email_count()

As an aside, this is built into memcached, and that may be a better solution (I'm not sure what problem domain you are working in).

You can also use nested functions:

def cache(time_to_live=3600 * 24):  # one DAY default (or whatever)
    def _wrap(fn):
        my_cache = {}
        def _inner_fn(*args, **kwargs):
            kws = tuple(sorted(kwargs.items()))  # sorted so kwarg order doesn't change the key
            key = tuple(args) + kws
            if key not in my_cache or time.time() > my_cache[key]['expires']:
                my_cache[key] = {"value": fn(*args, **kwargs),
                                 "expires": time.time() + time_to_live}
            return my_cache[key]['value']
        return _inner_fn
    return _wrap