
Use LinkedHashMap to implement LRU cache

Posted 2019-01-31 04:39

Question:

I was trying to implement an LRU cache using LinkedHashMap. The documentation of LinkedHashMap (http://docs.oracle.com/javase/7/docs/api/java/util/LinkedHashMap.html) says:

Note that insertion order is not affected if a key is re-inserted into the map.

But when I do the following puts:

import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {
    private int size;

    public static void main(String[] args) {
        LRUCache<Integer, Integer> cache = LRUCache.newInstance(2);
        cache.put(1, 1);
        cache.put(2, 2);
        cache.put(1, 1);
        cache.put(3, 3);

        System.out.println(cache);
    }

    private LRUCache(int size) {
        super(size, 0.75f, true);
        this.size = size;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > size;
    }

    public static <K, V> LRUCache<K, V> newInstance(int size) {
        return new LRUCache<K, V>(size);
    }

}

The output is

{1=1, 3=3}

This indicates that the re-insert did affect the order. Does anybody have an explanation?

Answer 1:

As pointed out by Jeffrey, you are using accessOrder. When you created the LinkedHashMap, the third constructor parameter specifies how the ordering is maintained:

"true for access-order, false for insertion-order"

For a more detailed LRU implementation, you can look at http://www.programcreek.com/2013/03/leetcode-lru-cache-java/



Answer 2:

But you aren't using insertion order, you're using access order.

order of iteration is the order in which its entries were last accessed, from least-recently accessed to most-recently (access-order)

...

Invoking the put or get method results in an access to the corresponding entry

So this is the state of your cache as you modify it:

    LRUCache<Integer, Integer> cache = LRUCache.newInstance(2);
    cache.put(1, 1); // { 1=1 }
    cache.put(2, 2); // { 1=1, 2=2 }
    cache.put(1, 1); // { 2=2, 1=1 }
    cache.put(3, 3); // { 1=1, 3=3 }
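
For contrast, a sketch not in the original answer, assuming the cache were constructed with super(size, 0.75f, false), i.e. insertion order: the re-insert of key 1 would not move it, so key 1 would be the eldest entry when 3 arrives.

    LRUCache<Integer, Integer> cache = LRUCache.newInstance(2); // assuming accessOrder = false
    cache.put(1, 1); // { 1=1 }
    cache.put(2, 2); // { 1=1, 2=2 }
    cache.put(1, 1); // { 1=1, 2=2 } re-insertion leaves the order unchanged
    cache.put(3, 3); // { 2=2, 3=3 } key 1 is the eldest and gets evicted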


Answer 3:

Here is my implementation using a LinkedHashMap in access order. It moves the most recently accessed entry to the end of the iteration order (the eldest entry stays at the head), which incurs only O(1) overhead because the underlying entries are organized in a doubly-linked list while also being indexed by the hash function. So get, put, and finding the newest or eldest entry all cost O(1).

import java.util.LinkedHashMap;
import java.util.Map;

class LRUCache extends LinkedHashMap<Integer, Integer> {
    private int maxSize;
    public LRUCache(int capacity) {
        super(capacity, 0.75f, true);
        this.maxSize = capacity;
    }

    // return -1 on a cache miss
    public int get(int key) {
        Integer v = super.get(key);
        return v == null ? -1 : v;
    }

    public void put(int key, int value) {
        super.put(key, value);
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
        return this.size() > maxSize; // must be overridden to keep the cache size bounded
    }
}
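
A short usage sketch, not part of the original answer, that could go inside a main method next to the class above; the comments show the iteration order after each call:

LRUCache cache = new LRUCache(2);
cache.put(1, 1);                  // {1=1}
cache.put(2, 2);                  // {1=1, 2=2}
cache.get(1);                     // access moves key 1 to the end: {2=2, 1=1}
cache.put(3, 3);                  // key 2 is now the eldest and is evicted: {1=1, 3=3}
System.out.println(cache.get(2)); // -1 (miss)
System.out.println(cache.get(1)); // 1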


Answer 4:

I also implemented an LRU cache with a small change to the code. I have tested it, and it works exactly as an LRU cache should.

package com.first.misc;
import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCachDemo {
    public static void main(String[] args) {
        LRUCache<String, String> lruCache = new LRUCache<>(3);
        lruCache.cacheable("test", "test");
        lruCache.cacheable("test1", "test1");
        lruCache.cacheable("test2", "test2");
        lruCache.cacheable("test3", "test3");
        lruCache.cacheable("test4", "test4");
        lruCache.cacheable("test", "test");

        System.out.println(lruCache.toString());
    }
}

class LRUCache<K, T> {
    private Map<K, T> cache;
    private int windowSize;

    public LRUCache(final int windowSize) {
        this.windowSize = windowSize;
        // note: this anonymous LinkedHashMap uses the default insertion order
        this.cache = new LinkedHashMap<K, T>() {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, T> eldest) {
                return size() > windowSize;
            }
        };
    }


    // put data in the cache
    public void cacheable(K key, T data) {
        // if the key already exists, remove it and re-insert it below
        // so that it becomes the most recently used entry;
        // removeEldestEntry evicts the eldest entry once the window size is exceeded
        if (cache.containsKey(key)) {
            cache.remove(key);
        }

        cache.put(key, data);
    }

    // eviction is handled by removeEldestEntry in the constructor above

    @Override
    public String toString() {
        return "LRUCache{" +
                "cache=" + cache.toString() +
                ", windowSize=" + windowSize +
                '}';
    }
}
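
One caveat, not raised in the original answer: the inner LinkedHashMap uses the default insertion order, so only cacheable() refreshes recency; a plain read through the map would not. A hypothetical lookup helper (the name lookup is made up) that could be added to the class above, reusing the same remove-and-re-insert trick for reads:

    // Hypothetical read path: refreshes recency on lookup, since the
    // inner map was created with the default insertion order.
    public T lookup(K key) {
        T data = cache.remove(key);   // null when the key is absent
        if (data != null) {
            cache.put(key, data);     // re-insert so the entry becomes the newest
        }
        return data;
    }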