setSoTimeout on the client socket has no effect

I have a Java application with three threads, each of which opens a socket and connects to the server on a different port. I call setSoTimeout on each of these sockets after the connection to the server is established. After that, each thread blocks in read(). Only one of the threads times out after 20 seconds (the timeout I set); the other two ignore the timeout. Is it possible that the TCP layer only handles one timeout at a time? Are there any other explanations?
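For reference, here is a minimal sketch of the setup described above (no code was posted, so the host name and ports are hypothetical): each thread opens its own socket, sets the timeout after connecting, and then blocks in read().

```java
import java.io.InputStream;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class TimeoutWorker implements Runnable {
    private final String host;
    private final int port;

    TimeoutWorker(String host, int port) {
        this.host = host;
        this.port = port;
    }

    @Override
    public void run() {
        try (Socket socket = new Socket(host, port)) {
            // Timeout is set after the connection is established, as described in the question.
            socket.setSoTimeout(20_000); // 20 seconds
            InputStream in = socket.getInputStream();
            byte[] buf = new byte[1024];
            int n = in.read(buf); // expected to throw SocketTimeoutException after 20 s with no data
            System.out.println("read " + n + " bytes on port " + port);
        } catch (SocketTimeoutException e) {
            System.out.println("read timed out on port " + port);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) {
        // Hypothetical host and ports; three threads, one socket each.
        for (int port : new int[] {9001, 9002, 9003}) {
            new Thread(new TimeoutWorker("example.com", port)).start();
        }
    }
}
```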

+2
java timeout sockets
Aug 20 '09 at 13:16
2 answers

In the past I have had several problems with SO_TIMEOUT on Windows. I believe the setting is "passed down" to the underlying socket implementation, which can be OS dependent and subject to registry settings, etc.

My advice is not to rely on SO_TIMEOUT to force timeout exceptions. Either use non-blocking I/O, or make sure bytes are available (via available()) before calling read().
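As an illustration of the second suggestion, here is a sketch that polls available() and only calls read() once data is buffered, enforcing the deadline in application code rather than through SO_TIMEOUT (the class name, method name, and polling interval are illustrative):

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;
import java.net.SocketTimeoutException;

public final class PollingReader {

    // Poll available() and only call read() once data is buffered, so read()
    // never blocks; the timeout is enforced by the loop below.
    public static int readWithDeadline(Socket socket, byte[] buf, long timeoutMillis)
            throws IOException {
        InputStream in = socket.getInputStream();
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (in.available() > 0) {
                return in.read(buf); // does not block: bytes are already buffered
            }
            try {
                Thread.sleep(50); // illustrative polling interval
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                throw new IOException("interrupted while waiting for data", e);
            }
        }
        throw new SocketTimeoutException("no data within " + timeoutMillis + " ms");
    }
}
```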

+3
Aug 20 '09 at 20:43

The documentation says:

The option must be enabled prior to entering the blocking operation to have effect.

Perhaps you should set it before the connection to the server is established, or at least before you call read() on the socket.
But it's hard to say without seeing the code...
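For example, a minimal sketch that enables the option on an unconnected socket, before connecting and before the blocking read (host and port are hypothetical):

```java
import java.io.InputStream;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketTimeoutException;

public class SetTimeoutBeforeRead {
    public static void main(String[] args) throws Exception {
        try (Socket socket = new Socket()) {
            socket.setSoTimeout(20_000); // enable SO_TIMEOUT before connecting and reading
            socket.connect(new InetSocketAddress("example.com", 9001)); // hypothetical endpoint
            InputStream in = socket.getInputStream();
            byte[] buf = new byte[1024];
            int n = in.read(buf); // should throw SocketTimeoutException after 20 s with no data
            System.out.println("read " + n + " bytes");
        } catch (SocketTimeoutException e) {
            System.out.println("read timed out: " + e.getMessage());
        }
    }
}
```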

+2
Aug 20 '09 at 13:46


