I'm opening lots of files with fopen() in VC++ but after a while it fails.
Is there a limit to the number of files you can open simultaneously?
Yes, there are limits, depending on the access level you use when opening the files. You can use _getmaxstdio to find the limit and _setmaxstdio to change it.
The C run-time libraries have a default limit of 512 on the number of files that can be open at any one time; attempting to open more than the maximum number of file descriptors or file streams causes program failure. Use _setmaxstdio to change this number.
Also, you may need to check whether your version of Windows supports the upper limit you are trying to set with _setmaxstdio. The Microsoft documentation for _setmaxstdio has more details, including the information corresponding to VS 2015.
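As a minimal sketch (assuming the Microsoft CRT, where _getmaxstdio and _setmaxstdio are declared in <stdio.h>), you could query and raise the stream limit like this; the value 2048 is only an illustrative target, and the maximum the CRT will actually accept depends on its version:

    #include <cstdio>   // the Microsoft CRT exposes _getmaxstdio / _setmaxstdio here

    int main()
    {
        // Query the current CRT limit on simultaneously open FILE* streams.
        std::printf("Current stream limit: %d\n", _getmaxstdio());

        // Try to raise it; _setmaxstdio returns the new maximum on success
        // and -1 if the requested value is outside the supported range.
        const int requested = 2048;  // illustrative value, not a hard rule
        if (_setmaxstdio(requested) == -1)
            std::printf("Could not raise the limit to %d\n", requested);
        else
            std::printf("Stream limit is now %d\n", _getmaxstdio());

        return 0;
    }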
I don't know where Paulo got that number from. In Windows NT-based operating systems, the number of file handles open per process is essentially limited only by physical memory; it is certainly in the hundreds of thousands.
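As a rough illustration of that point, the sketch below (Win32-specific; "test.txt" is just a placeholder path for an existing file) opens the same file repeatedly through CreateFileA rather than fopen, so it consumes raw kernel handles and is not bound by the CRT's stream table:

    #include <windows.h>
    #include <cstdio>
    #include <vector>

    int main()
    {
        // Open the same file over and over via the Win32 API; these are raw
        // kernel handles, so the CRT's FILE*-stream limit does not apply.
        std::vector<HANDLE> handles;
        for (int i = 0; i < 10000; ++i)
        {
            HANDLE h = CreateFileA("test.txt", GENERIC_READ, FILE_SHARE_READ,
                                   nullptr, OPEN_EXISTING,
                                   FILE_ATTRIBUTE_NORMAL, nullptr);
            if (h == INVALID_HANDLE_VALUE)
            {
                std::printf("CreateFileA failed at handle #%d (error %lu)\n",
                            i, GetLastError());
                break;
            }
            handles.push_back(h);
        }

        std::printf("Opened %zu handles\n", handles.size());

        for (HANDLE h : handles)
            CloseHandle(h);
        return 0;
    }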
If you use the standard C/C++ POSIX libraries with Windows, the answer is "yes", there is a limit.
However, interestingly, the limit is imposed by the kind of C/C++ libraries that you are using.
I came across the following bug thread from MySQL (http://bugs.mysql.com/bug.php?id=24509). They were dealing with the same problem regarding the number of open files.
However, Paul DuBois explained that the problem could effectively be eliminated in Windows by using ...
Naturally, you could keep a theoretically large number of files open by using a technique similar to database connection pooling, but that would have a severe effect on performance.
Indeed, opening a large number of files could be bad design. However, some situations genuinely require it. For example, if you are building a database server that will be used by thousands of users or applications, the server will necessarily have to open a large number of files (or suffer a performance hit by using file-descriptor pooling techniques), as in the sketch below.
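A minimal sketch of such a pooling scheme is shown here. The FilePool class, its capacity, and its eviction policy are purely hypothetical (nothing here comes from the MySQL thread): it keeps at most a fixed number of FILE* streams open and closes the least recently used one to make room, at the cost of reopening files that get evicted.

    #include <cstddef>
    #include <cstdio>
    #include <list>
    #include <string>
    #include <unordered_map>
    #include <utility>

    class FilePool
    {
    public:
        explicit FilePool(std::size_t capacity) : capacity_(capacity) {}

        ~FilePool()
        {
            for (auto& entry : lru_)
                std::fclose(entry.second);
        }

        // Returns an open stream for 'path', reopening it (in append mode,
        // purely as an example) if it was evicted earlier; nullptr on failure.
        std::FILE* get(const std::string& path)
        {
            auto it = index_.find(path);
            if (it != index_.end())
            {
                // Already open: move it to the front of the LRU list and reuse it.
                lru_.splice(lru_.begin(), lru_, it->second);
                return it->second->second;
            }

            if (lru_.size() >= capacity_)
            {
                // Pool is full: close the least recently used stream.
                auto& victim = lru_.back();
                std::fclose(victim.second);
                index_.erase(victim.first);
                lru_.pop_back();
            }

            std::FILE* f = std::fopen(path.c_str(), "ab");
            if (!f)
                return nullptr;
            lru_.emplace_front(path, f);
            index_[path] = lru_.begin();
            return f;
        }

    private:
        using Entry = std::pair<std::string, std::FILE*>;
        std::size_t capacity_;
        std::list<Entry> lru_;   // most recently used entries at the front
        std::unordered_map<std::string, std::list<Entry>::iterator> index_;
    };

A caller would ask the pool for a stream before every write instead of holding a FILE* permanently; the trade-off, as noted above, is the extra fopen/fclose traffic whenever the working set exceeds the pool's capacity.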
I came across the same problem, but using Embarcadero C++Builder from RAD Studio 10.2. The C run-time there doesn't seem to provide _getmaxstdio or _setmaxstdio, only some macros, and their default limit is much lower than what is quoted here for other runtimes:
stdio.h:
_nfile.h:
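To see what a given run-time promises without digging through its headers, a small check like the sketch below may help. FOPEN_MAX is the standard C macro for the number of streams guaranteed to be openable at once; _NFILE_ is only an assumption about the Borland/Embarcadero macro alluded to above, so it is guarded and skipped if it isn't defined:

    #include <cstdio>

    int main()
    {
        // FOPEN_MAX is the standard C guarantee for simultaneously open streams.
        std::printf("FOPEN_MAX = %d\n", FOPEN_MAX);

    #ifdef _NFILE_
        // _NFILE_ is assumed to be the vendor-specific macro mentioned above;
        // it is not standard, so only print it if the run-time defines it.
        std::printf("_NFILE_ = %d\n", _NFILE_);
    #endif

        return 0;
    }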
Yes, there is a limit.
The limit depends on the OS and the memory available.
In the old DOS days, the limit was 255 simultaneously open files.
In Windows XP, the limit is higher (I believe it is 2,048, as stated by MSDN).
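If you don't want to rely on documented numbers, you can also probe the limit empirically, as in the sketch below ("probe.tmp" is a throwaway file name); note that stdin, stdout, and stderr already occupy three slots, so the count will come out slightly below the nominal limit:

    #include <cstdio>
    #include <vector>

    int main()
    {
        // Create the probe file once, then close it again.
        std::FILE* create = std::fopen("probe.tmp", "w");
        if (!create)
            return 1;
        std::fclose(create);

        // Keep reopening it for reading until fopen() fails.
        std::vector<std::FILE*> open_files;
        while (std::FILE* f = std::fopen("probe.tmp", "r"))
            open_files.push_back(f);

        std::printf("fopen() failed after %zu open streams\n", open_files.size());

        // Clean up: close everything and delete the probe file.
        for (std::FILE* f : open_files)
            std::fclose(f);
        std::remove("probe.tmp");
        return 0;
    }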