Is there a way to cache Python 3.5 definitions usi

Published 2019-07-27 05:30

Currently, I use functools' lru_cache to handle caching for the function. The problem is that the cache never grows large enough to make use of the LRU eviction, because the function takes no parameters. Instead, when called, the function opens a specific URL and returns its contents.

Is there a way to specify a 'time to live' for the cache, so that after a certain amount of time or a certain number of calls, it refreshes itself?
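The setup described above can be sketched as follows (the function body is a stand-in for the real download, since the actual URL is not given):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def get_contents():
    # Stand-in for the real download, e.g.
    # urllib.request.urlopen(SOME_URL).read()
    return "page contents"
```

Because the function takes no arguments, every call after the first is a cache hit, so the page is never refetched — which is exactly the problem described.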

2 answers
Lonely孤独者°
Answer 2 · 2019-07-27 06:09

The functools.lru_cache decorator accepts a maxsize argument, which keeps the results of up to the maxsize most recent calls.

You can check this by calling the cache_info() method of your decorated function.
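For example (a minimal illustration, not taken from the answer's code):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def square(x):
    return x * x

square(2)   # miss: computed and stored
square(2)   # hit: served from the cache
print(square.cache_info())  # CacheInfo(hits=1, misses=1, maxsize=128, currsize=1)
```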

If you want to refresh your cache completely, you can implement a cache object manually: count the number of calls and reset the cache whenever the count exceeds the maximum.

from functools import wraps


class Mycache(object):
    def __init__(self, maxcount):
        self.count = 0          # calls since the last reset
        self.maxcount = maxcount
        self.cache = {}

    def __call__(self, func):

        @wraps(func)
        def wrapped(*args):
            self.count += 1
            if self.count > self.maxcount:
                # too many calls: drop the whole cache and start over
                self.cache = {}
                self.count = 0
                result = self.cache[args] = func(*args)
            else:
                try:
                    result = self.cache[args]
                except KeyError:
                    # first call with these args since the last reset
                    result = self.cache[args] = func(*args)
            return result
        return wrapped

Demo:

@Mycache(3)
def a(arg):
    print("arg is : {}".format(arg))
    return arg ** 2

print(a(4))
print(a(4))
print(a(4))
print(a(4))
print(a(3))
print(a(4))

output:

arg is : 4
16
16
16
arg is : 4
16
arg is : 3
9
16
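A middle ground, not shown in the answer above, is to keep functools.lru_cache but pass it an extra "time bucket" argument, so entries expire on their own after a chosen TTL (a sketch; fetch_page is a stand-in for the real download):

```python
import time
from functools import lru_cache

calls = {"n": 0}

def fetch_page():
    # stand-in for the real download; counts how often it actually runs
    calls["n"] += 1
    return "contents"

def _ttl_bucket(ttl=3600):
    # integer that changes once every `ttl` seconds
    return int(time.time() // ttl)

@lru_cache(maxsize=1)
def _cached(bucket):
    return fetch_page()

def get_page():
    # same bucket -> cache hit; new bucket -> refetch
    return _cached(_ttl_bucket())
```

Two calls within the same hour hit the cache; once the bucket value changes, the next call misses and refetches.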
小情绪 Triste *
Answer 3 · 2019-07-27 06:13

I don't know of a decorator for this, but you can keep track of the last time you fetched the page and update as needed. If this is a single-threaded app, it can be simple:

import time

_cached_page = ''
_cached_page_time = 0

def get_page():
    global _cached_page, _cached_page_time
    now = time.time()
    # invalidate in 1 hour
    if not _cached_page or now - _cached_page_time > 60 * 60:
        _cached_page = get_the_page_here()
        _cached_page_time = time.time()
    return _cached_page

You could also age the page with a timer running in the background. You then need to guard access with a lock, which also makes the cache usable in a multithreaded program.

import threading

_cached_page = ''
_cached_page_lock = threading.Lock()

def _invalidate_page():
    global _cached_page
    with _cached_page_lock:
        _cached_page = ''

def get_page():
    global _cached_page
    with _cached_page_lock:
        if not _cached_page:
            _cached_page = get_the_page_here()
            # invalidate in 1 hour
            threading.Timer(60 * 60, _invalidate_page).start()
        return _cached_page

Finally, the server may include an Expires: ... field in the HTTP response headers. Depending on how well the service was written, this is a good indication of how long the page can be cached.
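For instance, an Expires header can be turned into a cache lifetime like this (a sketch; for simplicity the header lookup here is case-sensitive, which a real HTTP client would not be):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def seconds_until_expiry(headers):
    # `headers` is a plain dict of response header names to values
    expires = headers.get("Expires")
    if not expires:
        return 0  # no Expires header: assume not cacheable
    delta = parsedate_to_datetime(expires) - datetime.now(timezone.utc)
    return max(0.0, delta.total_seconds())
```

The returned value could feed directly into the timer interval used in the code above instead of the hard-coded hour.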
