I am writing a batch script that goes through every file in a directory, looks for code files, and modifies them in some way. After I finished that task, I tried to run it on a large directory of about 6,000 files. About 40 minutes in, the script crashed and the command prompt printed a series of out-of-memory errors. Running the script while watching Task Manager showed my program eating memory at a rate of roughly 1 MB per loop iteration. So, naturally thinking I had done something wrong, I cut out all the code I had written to isolate the problem. But then I was left with a file containing nothing but a for loop, and the problem still persisted!
Here is what I ran on a fairly large directory, as I said:
@echo off
setlocal ENABLEEXTENSIONS ENABLEDELAYEDEXPANSION
set directory=%CD%
for /R "%directory%" %%a in (*.c *.cpp *.h *.idl) do (
    set currentDir=%%~dpa
    pushd !currentDir!
    popd
)
I have actually been able to trim it down to:
@echo off
for /R "%CD%" %%a in (*) do echo
And the problem still persists.
Is there a memory leak in the batch for loop or am I just doing something wrong?
I am running Windows XP 32bit Service Pack 2 although I have tested and confirmed the problem is still present in Service Pack 3.
This is not so much an answer as a workaround (or rewrite). I have run your script and I do see some memory consumption, some of which is not released until the script finishes, but I do not get anything near 1 MB per iteration. (Just like @aphoria, I can run the script on the root of C: without problems on XP SP3 and Vista.)
I suggest closing any process not necessary for the script and running your script on one subdirectory at a time.
If you cannot solve this in any other manner, I suggest you try rewriting the script in PowerShell.
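The one-subdirectory-at-a-time idea might look something like the sketch below. It reuses the question's file masks, but it is untested against the failing setup, so treat it as an illustration rather than a verified fix:

```batch
@echo off
setlocal ENABLEEXTENSIONS ENABLEDELAYEDEXPANSION
rem Run a separate FOR /R pass per top-level subdirectory, so each
rem recursive file list covers one subtree at a time instead of the
rem entire directory tree in a single loop.
for /D %%d in ("%CD%\*") do (
    for /R "%%d" %%a in (*.c *.cpp *.h *.idl) do (
        echo Processing %%a
    )
)
rem Note: files sitting directly in %CD% itself are not visited by the
rem /D loop above; they would need one extra non-recursive pass, e.g.
rem FOR %%a IN (*.c *.cpp *.h *.idl) DO ...
endlocal
```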
The problem lies in the fact that CMD.EXE and COMMAND.COM set aside a specific amount of memory for their environment. Generally, when using batch files to do a lot of processing, you want to increase the amount of environment memory the script can use.
Depending on the OS you are running, you can do the following:
WINDOWS NT/2000/XP: You can alter COMMAND.COM environment memory with the /e switch. For example:
command.com /e:2048 /c BatchFile.BAT
will run BatchFile.BAT in a shell with 2048 bytes of environment memory.
WINDOWS 95/98/ME: You can change the environment memory allocated to a particular MS-DOS window or to the shortcut for a specific batch file. Open the Properties dialogue (right-click the shortcut or desktop icon and click Properties), then click the Memory tab. Use the initial environment pull-down box to set a size that is enough for your variables, say 2048 bytes (or more if you wish).
To be safe, I'd increase your disk memory usage as well.
It seems FOR /R consumes memory for its internal file list. Run from a drive root, it shows noticeable memory consumption, because a drive root typically contains a great many files recursively.
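If that internal list is the culprit, one way to sidestep FOR /R is to dump the listing with DIR up front and then read it back line by line with FOR /F. This is only a sketch; whether it actually avoids the per-iteration growth is an assumption that would need measuring:

```batch
@echo off
rem Write the recursive listing to a temp file first, then iterate over
rem its lines, instead of letting FOR /R build the file list internally.
dir /b /s "*.c" "*.cpp" "*.h" "*.idl" > "%TEMP%\filelist.txt"
for /f "usebackq delims=" %%a in ("%TEMP%\filelist.txt") do (
    echo %%a
)
del "%TEMP%\filelist.txt"
```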
This one shows much greater consumption:
@echo off
call :OUT_OF_MEMORY ^^
exit /b
:OUT_OF_MEMORY
call set __DUMMY=%%1
Use at your own risk, it can be unstoppable. :)
PS: Reproduces on Windows 7