I would like to implement a simple in-memory LRU cache, and I was thinking about a solution based on an IDictionary implementation that could handle a hashed LRU mechanism.
Coming from Java, I have experience with LinkedHashMap, which works fine for what I need, but I can't find a similar solution anywhere for .NET.
Has anyone developed one, or does anyone have experience with this kind of thing?
This takes Martin's code, applies Mr T's suggestions, and makes it StyleCop friendly. Oh, and it also allows values to be disposed as they cycle out of the cache.
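The referenced code isn't reproduced here, but a minimal sketch of the same idea, a Dictionary for O(1) lookup, a LinkedList for recency order, and Dispose called on values as they are evicted, might look like this (class and member names are illustrative, not the original answer's code):

```csharp
using System;
using System.Collections.Generic;

public class DisposingLruCache<TKey, TValue>
{
    private readonly int capacity;
    private readonly object gate = new object();
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> map;
    private readonly LinkedList<KeyValuePair<TKey, TValue>> order; // most recently used at the front

    public DisposingLruCache(int capacity)
    {
        this.capacity = capacity;
        this.map = new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>(capacity);
        this.order = new LinkedList<KeyValuePair<TKey, TValue>>();
    }

    public void Add(TKey key, TValue value)
    {
        lock (this.gate)
        {
            LinkedListNode<KeyValuePair<TKey, TValue>> existing;
            if (this.map.TryGetValue(key, out existing))
            {
                // Replace an existing entry so a stale node is not left in the list.
                this.order.Remove(existing);
                this.map.Remove(key);
            }
            else if (this.map.Count >= this.capacity)
            {
                // Cache is full: evict the least recently used entry and dispose its value.
                LinkedListNode<KeyValuePair<TKey, TValue>> last = this.order.Last;
                this.order.RemoveLast();
                this.map.Remove(last.Value.Key);

                IDisposable disposable = last.Value.Value as IDisposable;
                if (disposable != null)
                {
                    disposable.Dispose();
                }
            }

            LinkedListNode<KeyValuePair<TKey, TValue>> node =
                this.order.AddFirst(new KeyValuePair<TKey, TValue>(key, value));
            this.map.Add(key, node);
        }
    }

    public bool TryGetValue(TKey key, out TValue value)
    {
        lock (this.gate)
        {
            LinkedListNode<KeyValuePair<TKey, TValue>> node;
            if (this.map.TryGetValue(key, out node))
            {
                // Touch: move the entry to the front so it is evicted last.
                this.order.Remove(node);
                this.order.AddFirst(node);
                value = node.Value.Value;
                return true;
            }

            value = default(TValue);
            return false;
        }
    }
}
```

Casting with `as IDisposable` keeps the cache usable for values that don't own any resources.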
If it's an ASP.NET app, you can use the Cache class[1], but you'll be competing for space with everything else that's cached, which may or may not be what you want.
[1] http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx
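For reference, a minimal usage sketch of that class; the key format, the 20-minute sliding expiration, and the LoadCustomer helper are placeholders rather than anything from the original answer:

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class CacheExample
{
    public static object GetCustomer(int id)
    {
        Cache cache = HttpRuntime.Cache;
        string key = "customer:" + id; // example key; the cache is shared process-wide

        object cached = cache[key];
        if (cached == null)
        {
            cached = LoadCustomer(id); // placeholder for the real data access call
            cache.Insert(
                key,
                cached,
                null,                          // no CacheDependency
                Cache.NoAbsoluteExpiration,
                TimeSpan.FromMinutes(20));     // sliding expiration; items can also fall out under memory pressure
        }

        return cached;
    }

    private static object LoadCustomer(int id)
    {
        return new { Id = id }; // stand-in for a real lookup
    }
}
```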
There is nothing in the base class libraries that does this.
On the free side, maybe something like C5's HashedLinkedList would work.
If you're willing to pay, maybe check out this C# toolkit. It contains an implementation.
I've recently released a class called LurchTable to address the need for a C# variant of the LinkedHashMap. A brief discussion of the LurchTable can be found here.
Basic features:
Source Code: http://csharptest.net/browse/src/Library/Collections/LurchTable.cs
GitHub: https://github.com/csharptest/CSharpTest.Net.Collections
HTML Help: http://help.csharptest.net/
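A hedged usage sketch, assuming a constructor that takes an ordering mode and an item limit (that signature is an assumption here; check the linked source for the exact API):

```csharp
using System;
using CSharpTest.Net.Collections;

public static class LurchTableExample
{
    public static void Main()
    {
        // Assumed constructor: order entries by access and cap the size at 100 items,
        // so the least recently accessed entry is dropped once the limit is exceeded.
        var cache = new LurchTable<string, int>(LurchTableOrder.Access, 100);

        cache["a"] = 1;
        cache["b"] = 2;

        int value;
        if (cache.TryGetValue("a", out value)) // IDictionary-style access
        {
            Console.WriteLine(value);
        }
    }
}
```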
I don't believe so. I've certainly seen hand-rolled ones implemented several times in various unrelated projects (which more or less confirms this; if there were one, surely at least one of those projects would have used it).
It's pretty simple to implement, and is usually done by creating a class which contains both a Dictionary and a List. The keys go in the list (in order) and the items go in the dictionary.
When you add a new item to the collection, the Add function checks the length of the list, pulls out the last key if the list is too long, and then evicts that key and its value from the dictionary to match. Not much more to it, really.
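A rough sketch of that approach, with illustrative names; on a hit the key is also moved to the front of the list so that the order really does track recency:

```csharp
using System.Collections.Generic;

public class SimpleLruCache<TKey, TValue>
{
    private readonly int maxItems;
    private readonly Dictionary<TKey, TValue> items = new Dictionary<TKey, TValue>();
    private readonly List<TKey> keysByRecency = new List<TKey>(); // most recent first

    public SimpleLruCache(int maxItems)
    {
        this.maxItems = maxItems;
    }

    public void Add(TKey key, TValue value)
    {
        if (this.items.ContainsKey(key))
        {
            this.keysByRecency.Remove(key); // re-adding counts as a "use"
        }
        else if (this.keysByRecency.Count >= this.maxItems)
        {
            // List too long: drop the last (least recently used) key and its value.
            TKey oldest = this.keysByRecency[this.keysByRecency.Count - 1];
            this.keysByRecency.RemoveAt(this.keysByRecency.Count - 1);
            this.items.Remove(oldest);
        }

        this.keysByRecency.Insert(0, key);
        this.items[key] = value;
    }

    public bool TryGet(TKey key, out TValue value)
    {
        if (this.items.TryGetValue(key, out value))
        {
            // Move the key to the front so this entry is evicted last.
            this.keysByRecency.Remove(key);
            this.keysByRecency.Insert(0, key);
            return true;
        }

        return false;
    }
}
```

Note that List.Remove and Insert are O(n); a LinkedList avoids that, which is what the other answers here use.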
I like Lawrence's implementation. A Hashtable plus a LinkedList is a good solution. Regarding threading, I would not lock this with [MethodImpl(MethodImplOptions.Synchronized)], but would use ReaderWriterLockSlim or a spin lock (since contention is usually brief) instead. In the Get function I would first check whether the item is already the first item, rather than always removing and re-adding it. That makes it possible to stay within a reader lock, which does not block other readers.
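A rough sketch of just that Get path, assuming the usual dictionary plus linked-list pair (names are illustrative, and the Add/eviction path is omitted): check under the read lock first, and only take the write lock when the entry actually has to be moved.

```csharp
using System.Collections.Generic;
using System.Threading;

public class ReadFriendlyLruCache<TKey, TValue>
{
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> map =
        new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>();
    private readonly LinkedList<KeyValuePair<TKey, TValue>> order =
        new LinkedList<KeyValuePair<TKey, TValue>>(); // most recently used at the front

    public bool TryGet(TKey key, out TValue value)
    {
        this.rwLock.EnterReadLock();
        try
        {
            LinkedListNode<KeyValuePair<TKey, TValue>> node;
            if (this.map.TryGetValue(key, out node) && node == this.order.First)
            {
                // Already the most recent entry: nothing to move, so the shared
                // read lock is enough and other readers are not blocked.
                value = node.Value.Value;
                return true;
            }
        }
        finally
        {
            this.rwLock.ExitReadLock();
        }

        // Not first (or not found): take the write lock to move the entry to the front.
        this.rwLock.EnterWriteLock();
        try
        {
            LinkedListNode<KeyValuePair<TKey, TValue>> node;
            if (this.map.TryGetValue(key, out node))
            {
                this.order.Remove(node);
                this.order.AddFirst(node);
                value = node.Value.Value;
                return true;
            }

            value = default(TValue);
            return false;
        }
        finally
        {
            this.rwLock.ExitWriteLock();
        }
    }
}
```

Only the second path mutates the list, so repeated hits on the hottest entry never block each other.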