I have tried different caching approaches in a classic ASP site in order to offload the database during heavy traffic.
My approach was this:
Create a global Hashtable object in global.asa, in which I later store JScript objects:
<object id="SIZE_LIST" progid="System.Collections.HashTable" runat="Server" scope="Application"></object>
This gives me a global Hashtable object whose contents I replace at certain time intervals. The size only varies slightly, but I do .Remove() and .Add() all of the objects each time.
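For reference, a minimal sketch of how such a refresh might look from an ASP page: loadObjectsFromDb() is a hypothetical helper standing in for the real data access, and Hashtable.Clear() is used here in place of removing every key one by one.

    <%@ Language="JScript" %>
    <%
    // Hypothetical helper standing in for the real data access:
    // returns the fresh JScript objects, keyed by string.
    function loadObjectsFromDb() {
        return { "small": { w: 10 }, "medium": { w: 25 }, "large": { w: 40 } };
    }

    function refreshCache() {
        var fresh = loadObjectsFromDb();
        SIZE_LIST.Clear();                  // drop everything currently cached
        for (var key in fresh) {
            SIZE_LIST.Add(key, fresh[key]); // re-add the fresh objects
        }
    }

    refreshCache();
    %>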
This works very well, except that after a certain time the memory allocation of the app gets too high, causing irrational session behaviour: it will "forget" sessions but not invoke Session_OnStart() in global.asa, leaving visitors with an empty Session collection.
Can I somehow improve the memory reallocation process? Is there a better approach to object caching?
I have tried using plain text files with JSON-serialized data, but the deserialization of that is too much overhead. I thought about binary serialization, but I'm not sure it's even possible in classic ASP.
What are the reasons for using a .NET HashTable over a regular "Scripting.Dictionary"?
When you do classic ASP, why would a normal COM object not be enough?
Don't try to reinvent the wheel. Use memcache. It's free and easy to set up. As a bonus, it has TTL (time to live) support and works beyond server boundaries.
This is just a guess, but because you are stashing JScript objects into a .NET Hashtable, you could still be keeping references to those objects, so garbage collection never gets a chance to do its job properly.
I've used the Application object with JSON strings before; it works a treat, and eval is lightning fast at re-hydrating the data.
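A minimal sketch of that pattern; the key name and the JSON payload are just placeholders:

    <%@ Language="JScript" %>
    <%
    // Cache the serialized data (e.g. the result of a DB query) as a plain string.
    Application.Lock();
    Application("SIZE_LIST_JSON") = '{ "small": 10, "medium": 25, "large": 40 }';
    Application.Unlock();

    // Re-hydrate on demand; the wrapping parentheses let eval parse an object literal.
    var sizes = eval("(" + Application("SIZE_LIST_JSON") + ")");
    Response.Write(sizes.medium);
    %>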
If you are worried about your data getting out of date, you could use a server-side XMLHTTPRequest (which is surprisingly fast) to call an .aspx page that does the search and returns a JSON string for you. The .aspx page can handle caching for you with .NET's fantastic caching, and can even zip up the JSON string using SharpZipLib before stashing it in cache to reduce the memory footprint even more. That is also a great trick for stashing XML files in cache (600 KB down to 25 KB in 0.0016 of a second!).
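A minimal sketch of the server-side call from classic ASP; the /cache/search.aspx endpoint and the shape of the JSON it returns are assumptions for illustration:

    <%@ Language="JScript" %>
    <%
    // MSXML2.ServerXMLHTTP is the server-safe variant of XMLHTTPRequest.
    var http = Server.CreateObject("MSXML2.ServerXMLHTTP.6.0");
    http.open("GET", "http://localhost/cache/search.aspx?q=sizes", false); // synchronous
    http.send();

    if (http.status == 200) {
        // responseText is the JSON string produced (and cached) by the .aspx page;
        // its shape is whatever that page returns.
        var results = eval("(" + http.responseText + ")");
    }
    %>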
We've used all of the above on projects before; not perfect, but a good enough solution given the limitations and legacy code at the time. And they are all very, very fast (OK, not by .NET standards, but for scripting it's really fast).
I would opt for a more low-tech approach and store each object separately (with string keys) in the Application object. That way you don't have a single huge object from the scripting engine's point of view. Do you think that would help at all?
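A rough sketch of that, keeping each item as a JSON string per key (in line with the Application + JSON approach above); the key names and payloads are placeholders:

    <%@ Language="JScript" %>
    <%
    // One Application entry per cached item, keyed by a string.
    Application.Lock();
    Application("size:small")  = '{ "id": 1, "width": 10 }';
    Application("size:medium") = '{ "id": 2, "width": 25 }';
    Application.Unlock();

    // Look up and re-hydrate only the item a page actually needs.
    var item = eval("(" + Application("size:medium") + ")");
    Response.Write(item.width);
    %>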