Is there a limit on the number of open files in Windows?

Posted 2019-01-01 15:56

I'm opening lots of files with fopen() in VC++ but after a while it fails.

Is there a limit to the number of files you can open simultaneously?

Tags: c++ windows
7 answers
只若初见
Answer 2 · 2019-01-01 16:18

Yes, there are limits, depending on the access level you use when opening the files. You can call _getmaxstdio to find the current limit and _setmaxstdio to change it.
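A minimal sketch of how those calls might be used. The _MSC_VER guard is an assumption so the snippet also builds on non-Microsoft toolchains, where it falls back to the portable FOPEN_MAX constant; stream_limit is a hypothetical helper name:

```cpp
#include <cstdio>

// Returns the maximum number of simultaneously open FILE* streams the
// C runtime will allow (a sketch; exact APIs vary by toolchain).
int stream_limit() {
#ifdef _MSC_VER
    // MSVC CRT: the default is 512; _setmaxstdio can raise it and
    // returns the new maximum, or -1 on failure.
    int raised = _setmaxstdio(2048);
    return raised != -1 ? raised : _getmaxstdio();
#else
    // Portable floor: FOPEN_MAX is the number of streams the C library
    // guarantees can be open at once (the standard requires at least 8).
    return FOPEN_MAX;
#endif
}
```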

永恒的永恒
Answer 3 · 2019-01-01 16:27

The Microsoft C run-time libraries have a limit of 512 on the number of files that can be open at any one time. Attempting to open more than the maximum number of file descriptors or file streams causes program failure. Use _setmaxstdio to change this number; the run-time library documentation covers the details.

Also, you may have to check whether your version of Windows supports the upper limit you are trying to set with _setmaxstdio; the _setmaxstdio documentation gives the valid range.

Information on the subject corresponding to VS 2015 can be found in that version's documentation.

泛滥B
Answer 4 · 2019-01-01 16:29

I don't know where Paulo got that number from. In Windows NT-based operating systems, the number of file handles a process can open is limited essentially by physical memory; it's certainly in the hundreds of thousands.

只若初见
Answer 5 · 2019-01-01 16:31

If you use the standard C/C++ POSIX libraries with Windows, the answer is "yes", there is a limit.

However, interestingly, the limit is imposed by the kind of C/C++ libraries that you are using.

I came across the following bug report (http://bugs.mysql.com/bug.php?id=24509) from MySQL. They were dealing with the same problem with the number of open files.

However, Paul DuBois explained that the problem could effectively be eliminated in Windows by using ...

Win32 API calls (CreateFile(), WriteFile(), and so forth) and the default maximum number of open files has been increased to 16384. The maximum can be increased further by using the --max-open-files=N option at server startup.

Naturally, you could keep a theoretically large number of files open by using a technique similar to database connection pooling, but that would have a severe effect on performance.

Indeed, opening a large number of files can be bad design. However, some situations require it. For example, if you are building a database server that will be used by thousands of users or applications, the server will necessarily have to open a large number of files (or suffer a performance hit by using file-descriptor pooling techniques).
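The pooling idea mentioned above can be sketched as a small LRU cache of FILE* handles. FilePool and its members are hypothetical names, not anything from the thread; the point is only to show where the performance hit comes from, since every evicted file must later be reopened and re-seeked:

```cpp
#include <cstdio>
#include <list>
#include <string>
#include <unordered_map>

// Sketch of FILE*-handle pooling: at most `capacity` streams are open
// at once; the least-recently-used stream is closed (its offset saved)
// and transparently reopened on the next acquire().
class FilePool {
public:
    explicit FilePool(std::size_t capacity) : capacity_(capacity) {}

    FILE* acquire(const std::string& path) {
        auto it = open_.find(path);
        if (it != open_.end()) {           // already open: mark as recently used
            lru_.remove(path);
            lru_.push_front(path);
            return it->second;
        }
        if (open_.size() >= capacity_) evict();
        FILE* f = std::fopen(path.c_str(), "r");
        if (!f) return nullptr;
        long off = offsets_.count(path) ? offsets_[path] : 0L;
        std::fseek(f, off, SEEK_SET);      // restore the saved position
        open_[path] = f;
        lru_.push_front(path);
        return f;
    }

    ~FilePool() { while (!open_.empty()) evict(); }

private:
    void evict() {
        std::string victim = lru_.back();
        lru_.pop_back();
        FILE* f = open_[victim];
        offsets_[victim] = std::ftell(f);  // remember position before closing
        std::fclose(f);
        open_.erase(victim);
    }

    std::size_t capacity_;
    std::list<std::string> lru_;
    std::unordered_map<std::string, FILE*> open_;
    std::unordered_map<std::string, long> offsets_;
};
```

Every acquire() that misses the pool costs a close, an open, and a seek, which is exactly the overhead the answer warns about.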

零度萤火
Answer 6 · 2019-01-01 16:34

I came across the same problem, but using Embarcadero C++Builder from RAD Studio 10.2. That C run-time doesn't seem to provide _getmaxstdio or _setmaxstdio, just some macros, and their default limit is much lower than what is reported here for other runtimes:

stdio.h:

/* Number of files that can be open simultaneously
*/
#if defined(__STDC__)
#define FOPEN_MAX (_NFILE_)
#else
#define FOPEN_MAX (_NFILE_)
#define SYS_OPEN  (_NFILE_)
#endif

_nfile.h:

#if defined(_WIN64)
#define _NFILE_ 512
#else
#define _NFILE_ 50
#endif
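Whatever the headers advertise, the effective limit can also be probed empirically: open tmpfile() streams until the runtime refuses. A sketch (count_openable_streams is a made-up name); the measured count reflects both the CRT ceiling and any OS per-process descriptor limit, so it may be well below or above FOPEN_MAX:

```cpp
#include <cstdio>
#include <vector>

// Open tmpfile() streams until the runtime says no, count them, then
// close them all. tmpfile() streams are deleted automatically when
// closed, so the probe leaves nothing behind on disk.
int count_openable_streams() {
    std::vector<FILE*> streams;
    for (FILE* f = std::tmpfile(); f != nullptr; f = std::tmpfile())
        streams.push_back(f);
    int n = static_cast<int>(streams.size());
    for (FILE* f : streams) std::fclose(f);  // release everything again
    return n;
}
```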
谁念西风独自凉
Answer 7 · 2019-01-01 16:35

Yes, there is a limit.

The limit depends on the OS, and memory available.

In old DOS, the limit was 255 simultaneously open files.

In Windows XP, the limit is higher (I believe it's 2,048, as stated by MSDN).
