Too many open files: how many are open, what they are, and how many can the JVM open

Tags: Java, JVM

Java Problem Overview


I'm getting this exception in Java:

java.io.FileNotFoundException: (Too many open files) 

I'm looking for ways to eliminate this problem.

This error obviously indicates that the JVM has allocated too many handles and the underlying OS won't let it have more, most likely because I've got a leak somewhere with improperly closed connections or streams.

This process runs for days non-stop and eventually throws the exception; it consistently happens after 12-14 days of uptime.

How do you fight this? Is there a way to get a list of allocated handles in the JVM, or to track when it hits a certain amount? I'd love to have them printed so I can see how the count grows and when. I can't use a profiler because this is a production system, and I have difficulty reproducing the problem in development. Any suggestions?

I am monitoring free heap size and raising an "alarm" when it approaches 1% of the total specified in -Xmx. I also know that if my thread count climbs above 500, something is definitely getting out of hand. Now, is there a way to know that my JVM is allocating too many handles from the OS and not giving them back, e.g. sockets, open files, etc.? If I knew that, I'd know where to look and when.
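One way to get exactly this kind of alarm from inside the JVM, for the record: on HotSpot/OpenJDK running on Unix, the platform OperatingSystemMXBean can usually be cast to com.sun.management.UnixOperatingSystemMXBean, which exposes the process's open and maximum file-descriptor counts. A minimal watchdog sketch, assuming your JVM supports that cast; the 90% threshold and one-minute poll interval are arbitrary examples:

import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;

// Minimal sketch: periodically log open vs. maximum file descriptors.
// Relies on com.sun.management.UnixOperatingSystemMXBean, which is
// HotSpot/OpenJDK-specific and Unix-only -- check before casting.
public class FdWatchdog {
    public static void main(String[] args) throws InterruptedException {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        if (!(os instanceof com.sun.management.UnixOperatingSystemMXBean)) {
            System.err.println("File descriptor counts not available on this JVM/OS");
            return;
        }
        com.sun.management.UnixOperatingSystemMXBean unixOs =
                (com.sun.management.UnixOperatingSystemMXBean) os;
        while (true) {
            long open = unixOs.getOpenFileDescriptorCount();
            long max = unixOs.getMaxFileDescriptorCount();
            System.out.printf("open fds: %d / %d%n", open, max);
            if (open > max * 0.9) {                 // arbitrary alarm threshold
                System.err.println("ALARM: file descriptors nearly exhausted");
            }
            Thread.sleep(60_000);                   // poll once a minute
        }
    }
}

In a real service this would run in a daemon thread and feed the same alarm mechanism used for heap and thread counts; the two calls that matter are getOpenFileDescriptorCount() and getMaxFileDescriptorCount().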

Java Solutions


Solution 1 - Java

You didn't say which OS you are running on, but if you are on Linux you can use the lsof command:

lsof -p <pid of jvm>

That will list all the files opened by the JVM. If you are on Windows, you can use Process Explorer, which shows the open files for all processes.

Doing this will hopefully allow you to narrow down which bit of the code is keeping the files open.

Solution 2 - Java

Since you are on Linux, I'd suggest that you check the /proc filesystem. Inside it you will find a directory named after the PID of your process, containing a directory called 'fd'. If your process id is 1234, the path would be

/proc/1234/fd

Inside that directory you will find links to all open files (do an 'ls -l'). Usually you can tell from the filename which library or piece of code opens the file and fails to close it.
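The same directory can also be read from inside the process. As a Linux-only sketch (an addition, not part of the original answer), the following prints where each descriptor of the current JVM points, via the /proc/self/fd shortcut:

import java.io.IOException;
import java.nio.file.*;
import java.util.stream.Stream;

// Linux-only sketch: each entry in /proc/self/fd is a symlink to
// whatever the descriptor refers to (file, socket, pipe, ...).
public class FdDump {
    public static void main(String[] args) throws IOException {
        Path fdDir = Paths.get("/proc/self/fd");
        try (Stream<Path> fds = Files.list(fdDir)) {
            fds.forEach(fd -> {
                try {
                    System.out.println(fd.getFileName() + " -> "
                            + Files.readSymbolicLink(fd));
                } catch (IOException e) {
                    // descriptor may have closed between list() and readlink
                }
            });
        }
    }
}

Each entry is a symlink named after the descriptor number, so the output looks much like the 'ls -l' listing described above.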

Solution 3 - Java

So, a full answer (combining the answers from @phisch and @bramp): if you want to check all processes, you should use sudo. It's also worth saving the result to a file - lsof is not cheap, and the file can be useful for further investigation.

sudo lsof > lsof.log

Show the bad guys (with the update from @Arun's comment):

cat lsof.log | awk '{print $1 " " $2 " " $5}' | sort | uniq |awk '{ print $2 " " $1; }' | sort -rn | uniq -c | sort -rn | head -5

    2687 114970 java
    131 127992 nginx
    109 128005 nginx
    105 127994 nginx
    103 128019 nginx

Save list of file descriptors to file as well:

sudo ls -l /proc/114970/fd > fd.log

Show the top open files:

cat fd.log | awk '{ print $11 }' | sort | uniq -c | sort -rn | head -n 20
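If you would rather do that counting inside the JVM instead of with awk, here is a hedged Linux-only equivalent for the current process, using the same /proc/self/fd trick as in Solution 2:

import java.io.IOException;
import java.nio.file.*;
import java.util.Comparator;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Linux-only sketch: count how many descriptors point at each target,
// then print the top 20 -- the in-process analogue of the awk pipeline.
public class TopOpenFiles {
    public static void main(String[] args) throws IOException {
        Map<String, Long> counts;
        try (Stream<Path> fds = Files.list(Paths.get("/proc/self/fd"))) {
            counts = fds.map(fd -> {
                        try {
                            return Files.readSymbolicLink(fd).toString();
                        } catch (IOException e) {
                            return "(gone)"; // closed while we were looking
                        }
                    })
                    .collect(Collectors.groupingBy(t -> t, Collectors.counting()));
        }
        counts.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue(Comparator.reverseOrder()))
                .limit(20)
                .forEach(e -> System.out.println(e.getValue() + " " + e.getKey()));
    }
}

Note that this inspects the JVM's own process; to inspect another PID you would need permissions equivalent to the sudo used above.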

Solution 4 - Java

You can raise the limit on open files by adding the following to /etc/security/limits.conf:

# set the limit according to your needs
* soft nofile 2048
* hard nofile 2048

The new limits are applied by PAM at the start of a session, so log out and back in (or restart the service) for them to take effect; note that sysctl -p reloads /etc/sysctl.conf and does not re-read limits.conf.

Just for completeness, you can check the current limit for open files with:

ulimit -n

Solution 5 - Java

If you are on macOS:

sudo launchctl limit maxfiles <soft> <hard>
sudo launchctl limit maxfiles 1024 200000

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Dima | View Question on Stackoverflow
Solution 1 - Java | bramp | View Answer on Stackoverflow
Solution 2 - Java | phisch | View Answer on Stackoverflow
Solution 3 - Java | Jimilian | View Answer on Stackoverflow
Solution 4 - Java | gaboroncancio | View Answer on Stackoverflow
Solution 5 - Java | Sahan Jayasumana | View Answer on Stackoverflow