In an ASP.NET 2.0 site on IIS6 I would like to store Key / Value pairs in the Application Cache. Each Key will always be a 5-character string and each Value a string of 15 to 250 characters in length.
The usage scenario is that the Cache will be queried once per webpage request: if the Key exists, use its Value; otherwise query a database and, based on some application logic, either add a new Key / Value pair to the Cache or replace an existing entry.
In this scenario I expect the Cache to reach roughly 1,000 entries, at which point it will become stable and will rarely (if ever) change in the way described above.
Before I just "performance test it myself", does anyone have experience with this amount of cached data, and whether it is preferable for performance to:
(1) use 1 Cache object containing a `SortedDictionary<string, string>`
or
(2) allow the creation of 1,000 Cache entries and use the Cache itself as a dictionary, or
(3) conclude that it just doesn't matter for the amount of data in question? In that case, would your answer change if the number of entries increased to 10,000 or 100,000?
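To make the scenario concrete, the per-request flow I have in mind looks roughly like this sketch (the `LoadFromDatabase` helper is illustrative, not real code from the site):

```csharp
using System;
using System.Web;

public static class CacheLookup
{
    // Hypothetical stand-in for the real database query.
    static string LoadFromDatabase(string key)
    {
        return "db-value-for-" + key;
    }

    public static string GetValue(string key)
    {
        // HttpRuntime.Cache is the same cache exposed as HttpContext.Cache;
        // the indexer returns null when the key is absent.
        string value = (string)HttpRuntime.Cache[key];
        if (value == null)
        {
            value = LoadFromDatabase(key);
            // Insert overwrites any existing entry with the same key.
            HttpRuntime.Cache.Insert(key, value);
        }
        return value;
    }
}
```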
Many Thanks.
1000 is not a large amount of data; that will work fine, but you will need to think about synchronization if this data is shared between requests. In reality, a `lock` guarding access to a `Dictionary<string,string>` is probably fine, although you can be more fine-grained if you need to.

However, the inbuilt web cache (`HttpContext.Cache`) also addresses this same problem, and has all the thread-safety built in.

Don't use `SortedDictionary<,>` unless you care whether the data is sorted. I don't think you do.

As the numbers get larger, I'd be more inclined to think about stores such as redis / memcached, with local memory as a local shortcut.
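If you do go the single-object route rather than the web cache, a minimal sketch of the locked-dictionary approach (class and member names here are my own, purely illustrative) might look like:

```csharp
using System.Collections.Generic;

// Minimal sketch: one shared dictionary guarded by a single lock.
// Coarse-grained locking like this is usually fine for ~1,000 entries;
// only go finer-grained if profiling shows contention.
public class SharedLookup
{
    private readonly object _sync = new object();
    private readonly Dictionary<string, string> _map =
        new Dictionary<string, string>();

    // Returns null when the key is not cached.
    public string Get(string key)
    {
        lock (_sync)
        {
            string value;
            _map.TryGetValue(key, out value);
            return value;
        }
    }

    // Adds a new entry or replaces an existing one.
    public void Set(string key, string value)
    {
        lock (_sync)
        {
            _map[key] = value;
        }
    }
}
```

The caller would check `Get`, fall back to the database on a `null`, and call `Set` with whatever the application logic decides; two concurrent misses on the same key may both hit the database, which is harmless at this scale.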