When reviewing a peer's code, I found the line below:

BufferedReader br = new BufferedReader(new FileReader(PATH + fileName));

It just reads the file and concatenates its lines into one line, but I could not find any code that closes the reader. I therefore think this should cause a resource leak and eventually a "too many open files" error. To prove this, I wrote a test:
for (int i = 0; i < 7168; i++) { // ulimit -n ==> 7168
    BufferedReader br = new BufferedReader(new FileReader("src/main/resources/privateKey/foo.pem"));
    System.out.println(br.readLine());
}
System.in.read();
Very strangely, everything runs fine and the expected exception is never thrown.
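One plausible explanation (my assumption, not verified against the JVM source here) is that each reader becomes unreachable as soon as the loop moves on, and FileInputStream's finalizer closes the underlying descriptor when the garbage collector runs, so the open-file count never reaches the limit. A sketch of a variant that keeps every reader strongly reachable, which should defeat that effect (the temporary file and the count of 100 are stand-ins for foo.pem and the ulimit value):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class HoldOpenReaders {
    // Opens `count` readers on `path` and keeps them all in a list,
    // so each one stays strongly reachable and the GC can never
    // finalize (and implicitly close) it.
    static List<BufferedReader> openMany(String path, int count) throws IOException {
        List<BufferedReader> readers = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            readers.add(new BufferedReader(new FileReader(path)));
        }
        return readers;
    }

    public static void main(String[] args) throws IOException {
        // A temp file keeps the sketch self-contained; in the question the
        // path would be src/main/resources/privateKey/foo.pem and the count
        // the ulimit value (7168).
        Path tmp = Files.createTempFile("foo", ".pem");
        List<BufferedReader> readers = openMany(tmp.toString(), 100);
        System.out.println("open readers: " + readers.size()); // prints "open readers: 100"
        for (BufferedReader r : readers) {
            r.close();
        }
    }
}
```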
Checking the actual number of open files on the command line:
β ~ lsof -p 16276 | grep 'foo.pem' | wc -l
2538
Why only 2538 and not 7168?

So what happened? And how can I actually trigger the "too many open files" error?
As @GhostCat suggested, I changed 7168 to Integer.MAX_VALUE; this time it threw:
java.io.FileNotFoundException: src/main/resources/privateKey/foo.pem (Too many open files in system)
    at java.io.FileInputStream.open0(Native Method)
    at java.io.FileInputStream.open(FileInputStream.java:195)
when i was 27436. Checking the real open files on the command line again:
β ~ lsof | grep foo.pem | wc -l
7275
But where are the remaining (27346 - 7275) files? And why doesn't the ulimit number take effect?
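For reference, the leak itself would be avoided by closing the reader, e.g. with try-with-resources. A minimal self-contained sketch (using a temporary file in place of the original PATH + fileName):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class ReadAllLines {
    // Reads the whole file and concatenates its lines into one string.
    // try-with-resources guarantees the reader (and its file descriptor)
    // is closed even if readLine() throws.
    static String readConcatenated(String fileName) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader br = new BufferedReader(new FileReader(fileName))) {
            String line;
            while ((line = br.readLine()) != null) {
                sb.append(line);
            }
        } // br.close() is called here automatically
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical temp file standing in for PATH + fileName
        Path tmp = Files.createTempFile("foo", ".pem");
        Files.write(tmp, Arrays.asList("line1", "line2"));
        System.out.println(readConcatenated(tmp.toString())); // prints "line1line2"
    }
}
```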