Please don't say EHCache or OSCache, etc. Assume for the purposes of this question that I want to implement my own using just the SDK (learning by doing). Given that the cache will be used in a multithreaded environment, which data structures would you use? I've already implemented one using LinkedHashMap and Collections#synchronizedMap, but I'm curious whether any of the new concurrent collections would be better candidates.
UPDATE: I was just reading through Yegge's latest when I found this nugget:
If you need constant-time access and want to maintain the insertion order, you can't do better than a LinkedHashMap, a truly wonderful data structure. The only way it could possibly be more wonderful is if there were a concurrent version. But alas.
I was thinking almost exactly the same thing before I went with the LinkedHashMap + Collections#synchronizedMap implementation I mentioned above. Nice to know I hadn't just overlooked something.
Based on the answers so far, it sounds like my best bet for a highly concurrent LRU would be to extend ConcurrentHashMap using some of the same logic that LinkedHashMap uses.
I like lots of these suggestions, but for now I think I'll stick with LinkedHashMap + Collections.synchronizedMap. If I do revisit this in the future, I'll probably work on extending ConcurrentHashMap in the same way LinkedHashMap extends HashMap.

UPDATE: By request, here's the gist of my current implementation.
Hope this helps.
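The original code is not reproduced here, but the approach described above (a LinkedHashMap in access-order mode, wrapped with Collections.synchronizedMap) can be sketched roughly like this; the class name and the exact capacity handling are my own assumptions, not the author's code:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the described approach: a LinkedHashMap in access-order mode,
// wrapped with Collections.synchronizedMap for thread safety.
class LruCache<K, V> {
    private final Map<K, V> map;

    LruCache(final int capacity) {
        // accessOrder = true: iteration order runs from least- to most-recently used
        Map<K, V> backing = new LinkedHashMap<K, V>(capacity + 1, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > capacity; // evict the LRU entry once over capacity
            }
        };
        this.map = Collections.synchronizedMap(backing);
    }

    V get(K key)           { return map.get(key); }
    void put(K key, V val) { map.put(key, val); }
    int size()             { return map.size(); }
}
```

Note that synchronizedMap serializes every call on a single lock, and iteration over the map still requires external synchronization; that is the simplicity/concurrency trade-off the question is about.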
Here is my short implementation, please criticize or improve it!
I wanted to add a comment to the answer given by Hank, but somehow I am not able to, so please treat this as a comment.
LinkedHashMap can also maintain access order, based on a parameter passed to its constructor. It keeps a doubly linked list to maintain the order (see LinkedHashMap.Entry).
@Pacerier: it is correct that LinkedHashMap keeps the same iteration order when an element is added again, but only in insertion-order mode.
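A small demonstration of the insertion-order behavior mentioned here: re-inserting an existing key does not change the iteration order.

```java
import java.util.LinkedHashMap;
import java.util.Map;

class InsertionOrderDemo {
    public static void main(String[] args) {
        // Default constructor = insertion-order mode
        Map<String, Integer> m = new LinkedHashMap<>();
        m.put("a", 1);
        m.put("b", 2);
        m.put("c", 3);
        m.put("a", 9); // re-inserting an existing key: order is NOT affected
        System.out.println(m.keySet()); // [a, b, c]
    }
}
```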
This is what I found in the Javadoc for the LinkedHashMap.Entry object: this method takes care of moving a recently accessed element to the end of the list. So, all in all, LinkedHashMap is the best data structure for implementing an LRU cache.
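The access-order behavior described above can be seen in a short demo: with the accessOrder constructor flag set to true, a get() moves the accessed entry to the end of the iteration order.

```java
import java.util.LinkedHashMap;
import java.util.Map;

class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: iteration runs from least- to most-recently accessed
        Map<String, Integer> m = new LinkedHashMap<>(16, 0.75f, true);
        m.put("a", 1);
        m.put("b", 2);
        m.put("c", 3);
        m.get("a"); // the hook described above moves "a" to the end of the list
        System.out.println(m.keySet()); // [b, c, a]
    }
}
```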
Another thought, and even a simple implementation, using Java's LinkedHashMap collection.
LinkedHashMap provides the method removeEldestEntry, which can be overridden in the way shown in the example. The default implementation returns false. If it returns true and the size of the structure grows beyond the initial capacity, the eldest (oldest) element is removed.
We can have a page number and page content; in my case, the page number is an Integer and the page content is a String.
The result of running the above code is as follows:
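Since the answer's code block is not reproduced here, a minimal sketch of the removeEldestEntry approach with Integer page numbers and String page content (class and variable names are hypothetical) might look like this:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: an LRU page cache built by overriding removeEldestEntry.
class PageCache extends LinkedHashMap<Integer, String> {
    private final int capacity;

    PageCache(int capacity) {
        super(capacity + 1, 0.75f, true); // access-order mode for LRU behavior
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, String> eldest) {
        return size() > capacity; // drop the least-recently-used page
    }

    public static void main(String[] args) {
        PageCache cache = new PageCache(3);
        cache.put(1, "page 1");
        cache.put(2, "page 2");
        cache.put(3, "page 3");
        cache.put(4, "page 4"); // exceeds capacity, so page 1 is evicted
        System.out.println(cache.keySet()); // [2, 3, 4]
    }
}
```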
Here is my tested, best-performing concurrent LRU cache implementation, without any synchronized block:
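The answer's actual code is not shown, but as a hedged illustration of the general technique (not the author's implementation): a concurrent LRU without synchronized blocks is often built by pairing a ConcurrentHashMap for storage with a ConcurrentLinkedQueue that tracks recency. Note that the map and queue are not updated atomically, so ordering is best-effort under contention.

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

// Sketch: approximate LRU using only concurrent collections.
class ConcurrentLruCache<K, V> {
    private final int capacity;
    private final ConcurrentHashMap<K, V> map = new ConcurrentHashMap<>();
    private final ConcurrentLinkedQueue<K> queue = new ConcurrentLinkedQueue<>();

    ConcurrentLruCache(int capacity) {
        this.capacity = capacity;
    }

    V get(K key) {
        V value = map.get(key);
        if (value != null) {
            // Move the key to the tail to mark it recently used.
            // queue.remove is an O(n) scan: a known cost of this simple design.
            queue.remove(key);
            queue.add(key);
        }
        return value;
    }

    void put(K key, V value) {
        if (map.put(key, value) != null) {
            queue.remove(key); // re-insert an existing key at the tail
        }
        queue.add(key);
        while (map.size() > capacity) {
            K eldest = queue.poll(); // head of the queue = least recently used
            if (eldest != null) {
                map.remove(eldest);
            }
        }
    }
}
```

The O(n) queue.remove on every access is why production caches prefer more elaborate structures; this sketch only shows the basic shape of the idea.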