I know it's simple to implement, but I want to reuse something that already exists.
The problem I want to solve is that I load configurations (from XML, so I want to cache them) for different pages, roles, and so on, so the combination of inputs can grow quite large (but in 99% of cases it will not). To handle this 1%, I want to cap the number of items in the cache...
So far I have found org.apache.commons.collections.map.LRUMap in Apache Commons and it looks fine, but I would like to check out some alternatives as well. Any recommendations?
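For anyone landing here, a minimal usage sketch of the commons-collections 3.x LRUMap mentioned above (the key and value strings are just placeholders; this version of the class is not generic and is not thread safe on its own):

```java
import java.util.Collections;
import java.util.Map;

import org.apache.commons.collections.map.LRUMap;

public class CommonsLruExample {
    public static void main(String[] args) {
        // Bounded map: once 100 entries are reached, the least recently
        // used entry is evicted on the next put. Wrapped in synchronizedMap
        // because LRUMap itself is not thread safe.
        Map cache = Collections.synchronizedMap(new LRUMap(100));

        cache.put("page:home|role:admin", "parsed XML config"); // placeholder key/value
        Object config = cache.get("page:home|role:admin");
        System.out.println(config);
    }
}
```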
Here is a very simple and easy-to-use LRU cache in Java. Although it is short and simple, it is production quality. The code is explained (see the README.md) and comes with some unit tests.
You can use a LinkedHashMap (Java 1.4+):
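For instance, a minimal sketch: the access-order constructor plus an overridden removeEldestEntry gives you LRU eviction for free (the class name and the MAX_ENTRIES limit are illustrative):

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class LruCacheFactory {
    // Illustrative limit; tune it to the number of configurations you expect.
    private static final int MAX_ENTRIES = 100;

    public static <K, V> Map<K, V> create() {
        // accessOrder = true means iteration order is least recently accessed first,
        // so the eldest entry is exactly the LRU candidate.
        Map<K, V> map = new LinkedHashMap<K, V>(MAX_ENTRIES + 1, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                // Returning true evicts the eldest entry after each put
                // once the cache exceeds its limit.
                return size() > MAX_ENTRIES;
            }
        };
        // LinkedHashMap is not thread safe; synchronize if the cache is shared.
        return Collections.synchronizedMap(map);
    }
}
```

Note that Collections.synchronizedMap only gives you coarse-grained locking; under heavy concurrent access, a dedicated concurrent cache (see the ConcurrentLinkedHashMap answer below) is a better fit.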
Here is my implementation, which lets me keep an optimal number of elements in memory.
The point is that I do not need to keep track of which objects are currently being used, since I use a combination of a LinkedHashMap for the MRU objects and a WeakHashMap for the LRU objects. So the cache capacity is no less than the MRU size plus whatever the GC lets me keep. Whenever objects fall off the MRU, they go to the LRU for as long as the GC will have them.
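A rough sketch of that two-tier idea (the class and method names below are mine, not the original code): the access-ordered LinkedHashMap demotes its eldest entry into a WeakHashMap instead of discarding it, and a hit in the weak tier promotes the entry back. Bear in mind that WeakHashMap holds its keys weakly, so demoted entries disappear once the key is no longer strongly referenced elsewhere and the GC runs.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.WeakHashMap;

public class TwoTierCache<K, V> {
    private final int mruSize;
    private final Map<K, V> lru = new WeakHashMap<K, V>(); // weakly referenced overflow
    private final Map<K, V> mru;                           // strongly referenced, access-ordered

    public TwoTierCache(int mruSize) {
        this.mruSize = mruSize;
        this.mru = new LinkedHashMap<K, V>(mruSize, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                if (size() > TwoTierCache.this.mruSize) {
                    // Demote the eldest entry instead of dropping it outright;
                    // it survives only until the GC reclaims its key.
                    lru.put(eldest.getKey(), eldest.getValue());
                    return true;
                }
                return false;
            }
        };
    }

    public synchronized V get(K key) {
        V value = mru.get(key);
        if (value == null) {
            value = lru.remove(key);
            if (value != null) {
                mru.put(key, value); // promote back to the strongly held tier
            }
        }
        return value;
    }

    public synchronized void put(K key, V value) {
        lru.remove(key);
        mru.put(key, value);
    }
}
```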
This is an old question, but for posterity I wanted to list ConcurrentLinkedHashMap, which is thread safe, unlike LRUMap. Usage is quite easy:
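A usage sketch, assuming the com.googlecode.concurrentlinkedhashmap artifact and its Builder (the capacity of 1000 is just an example):

```java
import java.util.concurrent.ConcurrentMap;

import com.googlecode.concurrentlinkedhashmap.ConcurrentLinkedHashMap;

public class ConfigCacheExample {
    public static void main(String[] args) {
        // Bounded, thread-safe map; entries are evicted in approximate
        // least-recently-used order once the capacity is reached.
        ConcurrentMap<String, String> cache =
            new ConcurrentLinkedHashMap.Builder<String, String>()
                .maximumWeightedCapacity(1000) // illustrative bound
                .build();

        cache.put("pageA", "configForPageA"); // placeholder key/value
        System.out.println(cache.get("pageA"));
    }
}
```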
And the documentation has some good examples, like how to make the LRU cache size-based instead of number-of-items based.
I also had the same problem and I didn't find any good libraries... so I created my own.
simplelrucache provides thread-safe, very simple, non-distributed LRU caching with TTL support. It provides two implementations.
You can find it here.