We are designing a stress test application that sends a large number of HTTP requests, each about 1 MB in size, to a particular web service. To generate the load we use multiple threads: X EnqueueThreads create the HTTP request data and add it to a queue, and Y WorkerThreads dequeue the requests and submit them to the web service.
All requests are asynchronous.
The problem is that the enqueue threads work much faster than the worker threads, so without a stop/wait condition they keep adding requests until an OutOfMemoryException is thrown, which also makes the injector (the machine where this utility runs) slow.
At present we handle the OutOfMemoryException and make the enqueue threads sleep for some time. Another option I can think of is limiting the queue size (sketched below).
However, I would like to hear views on the best approach to make the most of the limited system resources (especially memory).
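For reference, here is roughly what I mean by limiting the queue size. A bounded BlockingCollection makes the enqueue threads block automatically once the limit is reached; the capacity of 100 and the HttpRequestData, CreateRequest and SubmitAsync names are placeholders for our own types and methods:

using System.Collections.Concurrent;

// Bounded queue: Add() blocks the enqueue threads once 100 items are pending,
// so memory stays proportional to the bound instead of growing without limit.
BlockingCollection<HttpRequestData> queue = new BlockingCollection<HttpRequestData>(boundedCapacity: 100);

// On each enqueue thread:
while (!queue.IsAddingCompleted)
{
    queue.Add(CreateRequest()); // blocks while the queue is full
}

// On each worker thread:
foreach (HttpRequestData request in queue.GetConsumingEnumerable())
{
    SubmitAsync(request); // fires the asynchronous call to the web service
}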
Thanks in advance.
Well, going by the title of the question, the best way to avoid an out-of-memory exception is not to create the objects that fill that memory in the first place.
Handling the exception is the easiest solution, though over time it may introduce difficulties and inconsistencies into the application. Another way would be to query how much memory the process is already using, like this:
using System.Diagnostics;

Process currentProcess = Process.GetCurrentProcess();
long memorySize = currentProcess.PrivateMemorySize64; // private memory of this process, in bytes
Then you can calculate a limit for your queue length based on an estimate of how much memory a single request object occupies.
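For example, with an assumed budget of 512 MB for queued requests and roughly 2 MB per request (both figures are assumptions, just to illustrate the calculation), the limit could be derived like this:

using System;
using System.Diagnostics;

// Derive a queue limit from a memory budget and an assumed per-request footprint.
const long memoryBudgetBytes = 512L * 1024 * 1024; // assumed budget for queued requests
const long bytesPerRequest = 2L * 1024 * 1024;     // assumed footprint: 1 MB payload plus overhead

long alreadyUsedBytes = Process.GetCurrentProcess().PrivateMemorySize64;
long maxQueueLength = Math.Max(0, (memoryBudgetBytes - alreadyUsedBytes) / bytesPerRequest);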
Another way would be to check the available memory in each worker thread; whenever there is no memory left, the thread can simply finish, as in the sketch below. Many threads would be spawning and dying, but the application would stay at its maximum available capacity.
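A minimal sketch of that idea, assuming a shared ConcurrentQueue and a 1 GB budget (both the budget and the Submit call are placeholders); each worker runs a method like this and returns when memory runs short:

using System.Diagnostics;

void WorkerLoop()
{
    const long memoryBudgetBytes = 1L * 1024 * 1024 * 1024; // assumed 1 GB budget

    HttpRequestData request;
    while (queue.TryDequeue(out request)) // queue is a shared ConcurrentQueue<HttpRequestData>
    {
        Submit(request); // placeholder for the web-service call

        if (Process.GetCurrentProcess().PrivateMemorySize64 > memoryBudgetBytes)
            return; // this worker finishes; a supervisor can spawn a fresh one later
    }
}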
You can and probably should use the MemoryFailPoint class in a scenario like this.
If you get an OutOfMemoryException, the application state could be corrupt and you shouldn't try to recover from it. MemoryFailPoint is designed to avoid that situation by letting you determine how much to slow the application down so that you never run out of memory in the first place. You let the framework decide whether you can perform the operation, rather than guessing how much you "think" you can get away with based on how much memory your app is using.
You should also check memory usage through the garbage collector rather than the process to get an accurate reading of how much managed memory is actually allocated. Using the private memory size will give you a much lower reading, and you could still end up in an out-of-memory situation even though it appears you have plenty to spare.
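For example (passing false so the call does not force a full collection):

// Managed memory the GC currently believes is allocated, in bytes.
long managedBytes = GC.GetTotalMemory(false);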
The code sample on the MSDN page for MemoryFailPoint shows how to estimate the memory an operation needs and use that estimate to wait until memory is available before processing more requests. If you can identify the areas of code with large memory requirements, this is a good way to constrain them and avoid running out of memory.
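A minimal sketch of that pattern, assuming each request needs on the order of 16 MB to build and submit (that figure and the ProcessRequest call are placeholders):

using System;
using System.Runtime;
using System.Threading;

try
{
    // Ask the CLR whether an operation needing roughly 16 MB is likely to succeed.
    using (MemoryFailPoint gate = new MemoryFailPoint(sizeInMegabytes: 16))
    {
        ProcessRequest(request); // placeholder: build and submit one request
    }
}
catch (InsufficientMemoryException)
{
    // Not enough memory right now; back off and let the workers drain the queue.
    Thread.Sleep(1000);
}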