I have a non-blocking Java server that monitors all of its socket channels with a selector. I then establish 500 client connections to the server and send data regularly. Every piece of data the server receives is echoed back to the client.
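For context, the server is structured roughly like the sketch below (simplified; the port, buffer size, and class name are made-up placeholders, not my actual code):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.ServerSocketChannel;
import java.nio.channels.SocketChannel;
import java.util.Iterator;

public class EchoServer {
    public static void main(String[] args) throws IOException {
        Selector selector = Selector.open();
        ServerSocketChannel server = ServerSocketChannel.open();
        server.socket().bind(new InetSocketAddress(9000));
        server.configureBlocking(false);
        server.register(selector, SelectionKey.OP_ACCEPT);

        ByteBuffer buffer = ByteBuffer.allocate(8192);
        while (true) {
            selector.select();                        // block until at least one channel is ready
            Iterator<SelectionKey> it = selector.selectedKeys().iterator();
            while (it.hasNext()) {
                SelectionKey key = it.next();
                it.remove();
                if (key.isAcceptable()) {
                    // Accept the new connection and register it for reads.
                    SocketChannel client = server.accept();
                    client.configureBlocking(false);
                    client.register(selector, SelectionKey.OP_READ);
                } else if (key.isReadable()) {
                    // Echo whatever the client sent straight back to it.
                    SocketChannel client = (SocketChannel) key.channel();
                    buffer.clear();
                    int n = client.read(buffer);
                    if (n == -1) {
                        key.cancel();
                        client.close();
                    } else {
                        buffer.flip();
                        while (buffer.hasRemaining()) {
                            client.write(buffer);
                        }
                    }
                }
            }
        }
    }
}
```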
The problem is that the test runs perfectly for several hours, and then suddenly every socket the server is managing throws an IOException (timeout) when the server tries to read data.
I checked whether the client thread was being starved (and therefore not sending data), but I yield to the client thread, which iterates over all the sockets and writes the data, so it does get CPU time. Traffic appears to flow properly the whole time, and then it simply dies. Any ideas what could trigger this behavior?
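The client thread looks roughly like this sketch (the socket list, payload, and error handling are placeholders for illustration, not my exact code):

```java
import java.io.IOException;
import java.net.Socket;
import java.util.List;

public class ClientLoop implements Runnable {
    private final List<Socket> sockets;               // the ~500 established connections
    private final byte[] payload = "ping".getBytes();

    public ClientLoop(List<Socket> sockets) { this.sockets = sockets; }

    public void run() {
        byte[] reply = new byte[payload.length];
        while (!Thread.currentThread().isInterrupted()) {
            for (Socket s : sockets) {
                try {
                    s.getOutputStream().write(payload);   // send data to the server
                    s.getOutputStream().flush();
                    int read = 0;
                    while (read < reply.length) {         // read the echoed data back
                        int n = s.getInputStream().read(reply, read, reply.length - read);
                        if (n == -1) break;
                        read += n;
                    }
                } catch (IOException e) {
                    e.printStackTrace();                  // this is where the timeouts show up
                }
            }
            Thread.yield();                               // give the server thread a chance to run
        }
    }
}
```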
I am running on Linux with the latest release of Java 6. My application uses two threads: one for the server and one for all the clients. Thanks in advance!
Update: The problem lies with Linux, not with my code. When I run the same setup on a Windows box (on the same hardware), the timeouts never occur, but on Linux they start appearing after a few hours. There must be some Linux TCP setting that causes this. Thanks for the suggestion.
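In case it matters, these are the socket-level knobs I can experiment with from Java while chasing the suspected TCP setting. This is only a sketch of the standard Socket API, not a confirmed fix, and the timeout value is an arbitrary example:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class SocketSetup {
    public static Socket connect(String host, int port) throws IOException {
        Socket s = new Socket();
        s.connect(new InetSocketAddress(host, port));
        s.setKeepAlive(true);      // enable TCP keepalive to detect half-dead connections
        s.setSoTimeout(30000);     // read() throws SocketTimeoutException after 30 s instead of blocking forever
        s.setTcpNoDelay(true);     // avoid Nagle delays for small request/response traffic
        return s;
    }
}
```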