Is there a limit on the number of open files in Windows?

I am opening a lot of files with fopen() in VC++, but after a while fopen() starts to fail.

Is there a limit on the number of files that you can open at a time?

+32
c++ windows
May 15 '09 at 18:31
7 answers

The C run-time libraries have a limit of 512 on the number of files that can be open at any one time. Attempting to open more than the maximum number of file descriptors or file streams causes program failure. Use _setmaxstdio to change this number. You can read more about this here.

You may also need to check whether your version of Windows supports the upper limit you are trying to set with _setmaxstdio. For more information on _setmaxstdio, check here.

Information relating to VS 2015 can be found here.
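
For reference, here is a minimal sketch of how _getmaxstdio and _setmaxstdio can be used together; the program is my own illustration, not taken from the linked documentation:

 #include <stdio.h>

 int main(void)
 {
     /* Query the current C run-time limit on simultaneously open streams. */
     printf("Current stream limit: %d\n", _getmaxstdio());

     /* Try to raise the limit; _setmaxstdio returns the new maximum,
        or -1 if the request cannot be satisfied. */
     if (_setmaxstdio(2048) == -1)
         perror("_setmaxstdio");
     else
         printf("New stream limit: %d\n", _getmaxstdio());

     return 0;
 }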

+50
May 15 '09 at 18:42

In case anyone is still unclear about what the limit applies to, I believe this is a per-process limit, not a system-wide one.

I just wrote a small test program that opens files until it fails. It got to 2045 files before failing (2045 + STDIN + STDOUT + STDERR = 2048), then I left it running and started a second copy.

The second copy showed the same behaviour, meaning at least 4096 files were open at once.
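
The test program itself isn't shown above; here is a minimal sketch of the idea, assuming the CRT stream limit is first raised with _setmaxstdio, and with "scratch.txt" as a placeholder for any existing readable file:

 #include <stdio.h>

 int main(void)
 {
     /* Raise the CRT stream limit first; otherwise fopen() stops at 512. */
     _setmaxstdio(2048);

     int count = 0;
     /* Open the same file over and over, never closing, until it fails. */
     while (fopen("scratch.txt", "r") != NULL)
         count++;

     printf("Opened %d streams before fopen() failed.\n", count);
     getchar();  /* keep the streams open so a second copy can be started */
     return 0;
 }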

+10
Jun 18 '09 at 10:04

If you use the standard POSIX C/C++ libraries with Windows, the answer is yes, there is a limit.

Interestingly, however, the limit is imposed by the kind of C/C++ libraries that you use.

I came across the following bug-tracker thread from MySQL ( http://bugs.mysql.com/bug.php?id=24509 ); they ran into the same problem with the number of open files.

However, Paul DuBois explained that the problem can effectively be fixed on Windows by using ...

Win32 API calls (CreateFile(), WriteFile(), and so forth), and the default maximum number of open files has been increased to 16384. The maximum can be increased further using the --max-open-files=N option at server startup.

Naturally, you could have a theoretically large number of open files by using a technique similar to database connection pooling, but that would have a severe impact on performance.

Indeed, opening a large number of files can be a sign of poor design; however, some situations genuinely require it. For example, if you are building a database server that will be used by thousands of users or applications, the server will necessarily have to open a large number of files (or take a performance hit by resorting to file-handle pooling techniques).
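
To make the pooling idea concrete, here is a rough sketch of a least-recently-used file-handle pool. pool_get, POOL_SIZE, and the struct layout are hypothetical names of my own, not anything from the MySQL source:

 #include <stdio.h>
 #include <string.h>

 #define POOL_SIZE 64            /* stays well under the CRT's 512 default */

 /* One pooled stream: the path it serves and when it was last used. */
 struct slot {
     char path[260];
     FILE *fp;
     unsigned long last_used;
 };

 static struct slot pool[POOL_SIZE];
 static unsigned long clock_tick;

 /* Return an open stream for path, reusing a cached one when possible
    and evicting the least recently used stream when the pool is full. */
 FILE *pool_get(const char *path)
 {
     int i, free_idx = -1, lru_idx = 0;

     for (i = 0; i < POOL_SIZE; i++) {
         if (pool[i].fp && strcmp(pool[i].path, path) == 0) {
             pool[i].last_used = ++clock_tick;      /* cache hit */
             return pool[i].fp;
         }
         if (!pool[i].fp)
             free_idx = i;
         else if (pool[i].last_used < pool[lru_idx].last_used)
             lru_idx = i;
     }

     i = (free_idx != -1) ? free_idx : lru_idx;
     if (pool[i].fp)
         fclose(pool[i].fp);                        /* evict the LRU stream */

     pool[i].fp = fopen(path, "rb");
     if (pool[i].fp) {
         strncpy(pool[i].path, path, sizeof pool[i].path - 1);
         pool[i].path[sizeof pool[i].path - 1] = '\0';
         pool[i].last_used = ++clock_tick;
     }
     return pool[i].fp;
 }

The performance cost mentioned above is visible here: every miss pays for a close and a reopen, and a caller cannot hold a stream across calls because it may be evicted underneath it.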

+10
Nov 25 '10 at 11:08

Yes, there are limits, depending on the access level you use when opening the files. You can use _getmaxstdio to find the limit and _setmaxstdio to change it.

+7
May 15 '09 at 18:36

I don't know where Paulo got his number from. In Windows NT-based operating systems, the number of file handles opened per process is basically limited by physical memory; it's certainly in the hundreds of thousands.
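
The Win32 handle table can be probed the same way the earlier test program probed the CRT: open handles directly with CreateFileA(), which bypasses the C run-time's stream table entirely. A minimal sketch, with "scratch.txt" again a placeholder for any existing file; note this may run a long time and consume a lot of memory before it stops:

 #include <windows.h>
 #include <stdio.h>

 int main(void)
 {
     unsigned long count = 0;

     /* Open the same file repeatedly via the Win32 API, never closing. */
     for (;;) {
         HANDLE h = CreateFileA("scratch.txt", GENERIC_READ,
                                FILE_SHARE_READ, NULL, OPEN_EXISTING,
                                FILE_ATTRIBUTE_NORMAL, NULL);
         if (h == INVALID_HANDLE_VALUE)
             break;
         count++;
     }

     printf("Opened %lu Win32 handles before CreateFile() failed.\n", count);
     return 0;
 }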

+3
May 16 '09 at 18:40

Yes, there is a limit.

The limit depends on the OS and available memory.

Under old DOS, the limit was 255 simultaneously open files.

In Windows XP, the limit is higher (I believe it is 2,048, as indicated by MSDN).

0
May 15 '09 at 18:36

The same problem happened to me, but using Embarcadero C++Builder (RAD Studio 10.2). The C run-time of that product does not seem to provide _getmaxstdio or _setmaxstdio, only some macros, and their default limit is much lower than what is stated here for the other run-times:

stdio.h:

 /* Number of files that can be open simultaneously */
 #if defined(__STDC__)
 #define FOPEN_MAX (_NFILE_)
 #else
 #define FOPEN_MAX (_NFILE_)
 #define SYS_OPEN (_NFILE_)
 #endif

_nfile.h:

 #if defined(_WIN64)
 #define _NFILE_ 512
 #else
 #define _NFILE_ 50
 #endif
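
Whatever the run-time, the standard FOPEN_MAX macro is the portable way to see the compiled-in ceiling; on C++Builder it expands to _NFILE_ as shown above. A trivial check:

 #include <stdio.h>

 int main(void)
 {
     /* FOPEN_MAX: the number of streams the implementation guarantees
        can be open simultaneously. */
     printf("FOPEN_MAX = %d\n", (int)FOPEN_MAX);
     return 0;
 }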
-1
Oct 25 '17 at 9:14


