Every now and again I find myself doing something moderately dumb that results in my program allocating all the memory it can get and then some.
This kind of thing used to cause the program to die fairly quickly with an "out of memory" error, but these days Windows will go out of its way to give this non-existent memory to the application, and is apparently prepared to commit suicide doing so. Not literally, of course, but it will starve itself of usable physical RAM so badly that even launching Task Manager takes half an hour of swapping (the runaway application is, after all, still allocating more and more memory the whole time).
This doesn't happen too often, but when it does it's disastrous. I usually have to reset my machine, causing data loss from time to time and generally a lot of inconvenience.
Do you have any practical advice on making the consequences of such a mistake less dire? Perhaps some registry tweak to limit the max amount of virtual memory an app is allowed to allocate? Or some CLR flag that will limit this only for the current application? (It's usually in .NET that I do this to myself.)
("Don't run out of RAM" and "Buy more RAM" are no use - the former I have no control over, and the latter I've already done.)
You could keep a command prompt open whenever you run a risky app. Then, if it starts to get out of control, you don't have to wait for Task Manager to load; just use:
taskkill /F /FI "MEMUSAGE ge 2000000"
This will (in theory) force-kill anything using more than roughly 2 GB of memory (the MEMUSAGE filter is measured in KB).
Use taskkill /? to get the full list of options it takes.
EDIT: Even better, run the command as a scheduled task every few minutes. Any process that starts to blow up will get zapped automatically.
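A minimal sketch of that, assuming the same 2 GB threshold and a five-minute interval (the task name KillMemHogs is just an example); schtasks registers a recurring task that runs the same taskkill filter as above:
schtasks /create /tn KillMemHogs /sc minute /mo 5 /tr "taskkill /F /FI \"MEMUSAGE ge 2000000\""
You can remove it again later with schtasks /delete /tn KillMemHogs.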
There's something you can do: limit the working set size of your process. Paste this into your Main() method:
#if DEBUG
// Requires "using System.Diagnostics;" at the top of the file.
// Caps the working set at 256 MB so a debug build can't drag the rest of the machine into swapping.
Process.GetCurrentProcess().MaxWorkingSet = new IntPtr(256 * 1024 * 1024);
#endif
That caps the amount of physical RAM the process can claim (it doesn't stop the allocations themselves), which keeps other processes from being swapped out completely.
Other things you can do:
- Add more RAM; there's no reason not to have at least 3 gigabytes these days.
- Defrag your paging file. That requires defragging the disk first, then defragging the paging file with, say, SysInternals' PageDefrag utility.
The latter maintenance task is especially important on old machines. A fragmented paging file can dramatically worsen swapping behavior; it's common on XP machines that were never defragged and have a smallish disk that was allowed to fill up. The fragmentation causes a lot of disk head seeks, badly hurting the odds that another process can swap itself back into RAM in a reasonable amount of time.
The obvious answer would be to run your program inside a virtual machine until it's tested to the point that you're reasonably certain such things won't happen.
If you don't like that amount of overhead, there is a bit of middle ground: you could run that process inside a job object with a limit set on the memory used for that job object.
In Windows you can control the attributes of a process, including memory limits, using Job Objects.
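Here's a rough sketch of that from managed code. The P/Invoke declarations mirror the Win32 job-object APIs, but the helper name (JobLimit.CapCurrentProcess) and the 512 MB cap are just illustrative choices, and real code should check the return values. Call it at the top of Main(). Note that before Windows 8 a process can belong to only one job, so the assignment can fail if something else (a debugger, for instance) has already put the process in a job.
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class JobLimit
{
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern IntPtr CreateJobObject(IntPtr lpJobAttributes, string lpName);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetInformationJobObject(IntPtr hJob, int infoClass,
        ref JOBOBJECT_EXTENDED_LIMIT_INFORMATION info, int cbInfo);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool AssignProcessToJobObject(IntPtr hJob, IntPtr hProcess);

    const int JobObjectExtendedLimitInformation = 9;
    const uint JOB_OBJECT_LIMIT_PROCESS_MEMORY = 0x00000100;

    [StructLayout(LayoutKind.Sequential)]
    struct IO_COUNTERS
    {
        public ulong ReadOperationCount, WriteOperationCount, OtherOperationCount;
        public ulong ReadTransferCount, WriteTransferCount, OtherTransferCount;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct JOBOBJECT_BASIC_LIMIT_INFORMATION
    {
        public long PerProcessUserTimeLimit;
        public long PerJobUserTimeLimit;
        public uint LimitFlags;
        public UIntPtr MinimumWorkingSetSize;
        public UIntPtr MaximumWorkingSetSize;
        public uint ActiveProcessLimit;
        public UIntPtr Affinity;
        public uint PriorityClass;
        public uint SchedulingClass;
    }

    [StructLayout(LayoutKind.Sequential)]
    struct JOBOBJECT_EXTENDED_LIMIT_INFORMATION
    {
        public JOBOBJECT_BASIC_LIMIT_INFORMATION BasicLimitInformation;
        public IO_COUNTERS IoInfo;
        public UIntPtr ProcessMemoryLimit;
        public UIntPtr JobMemoryLimit;
        public UIntPtr PeakProcessMemoryUsed;
        public UIntPtr PeakJobMemoryUsed;
    }

    // Puts the calling process into a job object whose per-process
    // committed-memory limit is 512 MB; allocations beyond that fail
    // (in .NET, as an OutOfMemoryException) instead of swamping the machine.
    public static void CapCurrentProcess()
    {
        IntPtr job = CreateJobObject(IntPtr.Zero, null);

        JOBOBJECT_EXTENDED_LIMIT_INFORMATION limits = new JOBOBJECT_EXTENDED_LIMIT_INFORMATION();
        limits.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_PROCESS_MEMORY;
        limits.ProcessMemoryLimit = new UIntPtr(512u * 1024 * 1024);

        SetInformationJobObject(job, JobObjectExtendedLimitInformation,
            ref limits, Marshal.SizeOf(typeof(JOBOBJECT_EXTENDED_LIMIT_INFORMATION)));
        AssignProcessToJobObject(job, Process.GetCurrentProcess().Handle);
    }
}
Once the limit is hit, further allocations in the runaway process fail immediately rather than dragging the whole machine into swapping.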
I usually use Task Manager in that case to kill the process before the machine runs out of memory. Task Manager keeps running reasonably well even as the machine starts paging badly, and after killing the process the machine will usually recover. Later versions of Windows (such as 7) generally have more survivability in these situations than earlier versions. Running without DWM (turning off Aero themes in Vista and 7) also tends to leave enough headroom to bring up Task Manager to monitor and potentially kill off runaway processes.