This question already has an answer here:
- How to detect that a network cable has been unplugged in a TCP connection? (3 answers)
I have a thread that blocks in a select()-based SSL_read(). The main thread writes whenever needed using SSL_write(). During unit testing, I found a problem:
- The client TCP socket connect()s to the server over SSL (TLS)
- After some time, disable the internet connection
- Write some data on the client TCP/SSL socket
Even without internet connectivity, SSL_write() returns the correct number of bytes written, instead of 0 or an error. Such internet disconnections are arbitrary in duration, neither very long nor very short.
My expectation is that whenever the internet connection drops, the socket should generate some detectable event, at which point I would tear down the socket and the SSL connection.
If I have to establish some hand-made client-server protocol, that is possible.
What is the best way to detect such a disconnection?
I expect a solution that uses few CPU cycles and minimal client-server communication. I am sure this is not a very unusual problem, and hence it must have been solved before.
[P.S.: The client sockets are opened on mobile platforms like Android & iOS; however, I would like to know a general solution.]