What is the difference between LRU and LFU cache implementations?
I know that LRU can be implemented using LinkedHashMap.
But how to implement LFU cache?
The main difference is in the eviction policy. LRU evicts the page that was used least recently, so it looks only at recency of access. LFU, by contrast, evicts the page that was used least frequently: it tracks an access count for each page, and a page with a higher count is kept even if it is older. When several pages share the same lowest frequency, the tie is usually broken FIFO-style, evicting the one that entered the cache first.
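A minimal sketch of that policy in Java (the class name and structure are illustrative, not a standard API): a LinkedHashMap holds the values in insertion order, which gives the FIFO tiebreak for free, and a side map holds the access counts. Eviction here is a linear scan, so this is O(n) per eviction; an O(1) variant needs frequency buckets.

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;

// LFU sketch: evict the key with the lowest access count;
// ties break FIFO via LinkedHashMap's insertion order.
class LfuCache<K, V> {
    private final int capacity;
    private final Map<K, V> values = new LinkedHashMap<>(); // insertion order = FIFO tiebreak
    private final Map<K, Integer> counts = new HashMap<>();

    LfuCache(int capacity) { this.capacity = capacity; }

    V get(K key) {
        if (!values.containsKey(key)) return null;
        counts.merge(key, 1, Integer::sum);
        return values.get(key);
    }

    void put(K key, V value) {
        if (!values.containsKey(key) && values.size() >= capacity) {
            K victim = null;
            int min = Integer.MAX_VALUE;
            for (K k : values.keySet()) {       // first key seen at the minimum count wins (FIFO)
                int c = counts.get(k);
                if (c < min) { min = c; victim = k; }
            }
            values.remove(victim);
            counts.remove(victim);
        }
        values.put(key, value);
        counts.merge(key, 1, Integer::sum);
    }
}
```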
Look at this resource
Let's consider a constant stream of cache requests with a cache capacity of 3, see below:
If we consider a Least Recently Used (LRU) cache backed by a HashMap plus a doubly linked list, giving O(1) eviction and O(1) load time, we would have the following elements cached while processing the requests above.
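In Java this LRU structure does not need to be hand-rolled: LinkedHashMap is itself a HashMap plus a doubly linked list, and in access order it gives exactly this behavior. A sketch (the class name and initial-capacity choice are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache via LinkedHashMap: accessOrder=true moves an entry to the tail
// on every get/put, and removeEldestEntry evicts the head (the least
// recently used entry) once capacity is exceeded.
class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LruCache(int capacity) {
        super(16, 0.75f, true);  // third argument: access order, not insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity;
    }
}
```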
When you look at this example, you can easily see that we can do better - given the higher expected chance of requesting an A in the future, we should not evict it even if it was least recently used.
A Least Frequently Used (LFU) cache takes advantage of this information by keeping track of how many times each entry has been requested and using that count in its eviction decision.
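One common way to get O(1) eviction for LFU is to group keys into buckets by access count and always evict from the lowest non-empty bucket. This is a sketch under that assumption (class name and structure are illustrative); a LinkedHashSet inside each bucket gives a FIFO tiebreak among equally frequent keys:

```java
import java.util.HashMap;
import java.util.LinkedHashSet;
import java.util.Map;

// O(1) LFU sketch: each key has a use count, and keys are grouped into
// buckets keyed by that count. Eviction removes the oldest key from the
// lowest-count bucket, so a frequently requested key survives.
class LfuCacheO1<K, V> {
    private final int capacity;
    private final Map<K, V> values = new HashMap<>();
    private final Map<K, Integer> counts = new HashMap<>();
    private final Map<Integer, LinkedHashSet<K>> buckets = new HashMap<>();
    private int minCount = 0;

    LfuCacheO1(int capacity) { this.capacity = capacity; }

    V get(K key) {
        if (!values.containsKey(key)) return null;
        touch(key);
        return values.get(key);
    }

    void put(K key, V value) {
        if (capacity <= 0) return;
        if (values.containsKey(key)) {
            values.put(key, value);
            touch(key);
            return;
        }
        if (values.size() >= capacity) {
            K victim = buckets.get(minCount).iterator().next(); // oldest key at min count
            buckets.get(minCount).remove(victim);
            values.remove(victim);
            counts.remove(victim);
        }
        values.put(key, value);
        counts.put(key, 1);
        buckets.computeIfAbsent(1, c -> new LinkedHashSet<>()).add(key);
        minCount = 1; // a brand-new key always starts at count 1
    }

    // Move a key from its current count bucket to the next one up.
    private void touch(K key) {
        int c = counts.get(key);
        buckets.get(c).remove(key);
        if (buckets.get(c).isEmpty() && c == minCount) minCount++;
        counts.put(key, c + 1);
        buckets.computeIfAbsent(c + 1, k -> new LinkedHashSet<>()).add(key);
    }
}
```

With a hypothetical request stream at capacity 3, a hot key stays cached: if A has been requested more often than C, inserting a new key D evicts C, not A, which is exactly the improvement over LRU described above.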