Especially with recursive code, there are massive improvements when using lru_cache. I do understand that a cache is a space that stores data that must be served quickly, saving the computer from recomputing it.
How does the Python lru_cache from functools work internally?
I'm looking for a specific answer: does it use dictionaries, like the rest of Python? Does it only store the return value?
I know that Python is heavily built on top of dictionaries; however, I couldn't find a specific answer to this question. Hopefully, someone can simplify the answer for all the users on StackOverflow.
You can check out the source code here.
Essentially, it uses two data structures: a dictionary mapping a function's parameters to its result, and a circular doubly linked list to keep track of your function call history in recency order.
The cache is essentially implemented with the following, which is pretty self-explanatory.
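As a rough illustration (a simplified sketch, not CPython's actual code), the two structures can be set up like this: a plain dict from call keys to links, plus a circular doubly linked list whose links are 4-item lists `[PREV, NEXT, KEY, RESULT]` hanging off a sentinel `root`. The `insert` helper and the `('fib', n)` keys here are my own, for illustration only:

```python
# Simplified sketch of lru_cache's two structures (not the real code):
# a dict mapping call keys to links, and a circular doubly linked list
# of [PREV, NEXT, KEY, RESULT] links rooted at a sentinel.
PREV, NEXT, KEY, RESULT = 0, 1, 2, 3   # field indices within each link

maxsize = 2
cache = {}                              # key -> link
root = []                               # sentinel link; list starts empty
root[:] = [root, root, None, None]      # points at itself in both directions

def insert(key, result):
    """Store a new result, evicting the least recently used entry if full."""
    if len(cache) >= maxsize:
        oldest = root[NEXT]             # LRU entry sits right after the root
        del cache[oldest[KEY]]
        root[NEXT] = oldest[NEXT]       # splice the oldest link out
        oldest[NEXT][PREV] = root
    last = root[PREV]                   # MRU entry sits right before the root
    link = [last, root, key, result]
    last[NEXT] = root[PREV] = cache[key] = link

insert(('fib', 1), 1)
insert(('fib', 2), 1)
insert(('fib', 3), 2)                   # evicts ('fib', 1), the LRU entry
```

The dict gives O(1) lookup of a cached result, while the linked list gives O(1) eviction of the least recently used entry, which is why both are needed.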
The gist of updating the linked list is:
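In a simplified form (my own sketch of the idea, not the real wrapper), updating the list on a cache hit means splicing the touched link out and re-attaching it just before the root, so it becomes the most recently used:

```python
# Sketch of the move-to-front update on a cache hit, mirroring the gist
# of what lru_cache does with its circular doubly linked list (simplified).
PREV, NEXT, KEY, RESULT = 0, 1, 2, 3

def move_to_front(root, link):
    """Unlink `link` and re-insert it just before the root (MRU position)."""
    prev, nxt = link[PREV], link[NEXT]
    prev[NEXT] = nxt                    # splice the link out...
    nxt[PREV] = prev
    last = root[PREV]
    last[NEXT] = root[PREV] = link      # ...and hang it right before the root
    link[PREV] = last
    link[NEXT] = root

# Build a tiny list root -> a -> b -> root, then touch `a`.
root = []
root[:] = [root, root, None, None]
a = [root, root, 'a', 1]
root[NEXT] = root[PREV] = a
b = [a, root, 'b', 2]
a[NEXT] = root[PREV] = b
move_to_front(root, a)                  # order becomes root -> b -> a -> root
```

Because only a handful of pointer assignments are involved, a hit never has to scan the list: recency bookkeeping stays O(1).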
Source of functools is available here: https://github.com/python/cpython/blob/3.6/Lib/functools.py
lru_cache uses the _lru_cache_wrapper decorator (the Python "decorator with arguments" pattern), which keeps a cache dictionary in its closure (every decorated function has its own cache dict) where it saves the return value of the called function. The dictionary key is generated by the _make_key function from the arguments. Added some comments below: