I have an AsyncSocket instance that I've been using as a server on an iPad, and another AsyncSocket instance running on a second iPad acting as the client. I have all the necessary code to exchange data between the client and server -- there are no issues there.
Everything works fine, but during bug testing of my app I have noticed one particularly strange (and irritating) issue:
If I turn off the server iPad (at which point none of the socket's delegate methods fire on the server), the client gets disconnected and enters a retry loop I wrote that continually attempts to reconnect. What's annoying is that even when the server comes back up, the client still can't connect to it. In fact, even if I restart the client app from scratch, it still can't connect to the server. I have to restart the server app before the client can connect again.
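For reference, here's a stripped-down sketch of the kind of retry logic I'm using (the class name, host, port and retry delay are placeholders, not my exact code):

```objc
#import <Foundation/Foundation.h>
#import "AsyncSocket.h"

@interface ClientConnection : NSObject
@property (nonatomic, strong) AsyncSocket *socket;
@end

@implementation ClientConnection

- (void)connect
{
    // Tear down any previous socket before creating a fresh one.
    [self.socket setDelegate:nil];
    [self.socket disconnect];

    self.socket = [[AsyncSocket alloc] initWithDelegate:self];

    NSError *err = nil;
    if (![self.socket connectToHost:@"192.168.1.10" onPort:5000 error:&err]) {
        NSLog(@"Connect attempt failed immediately: %@", err);
        [self scheduleRetry];
    }
}

- (void)scheduleRetry
{
    // Try again after a short delay rather than spinning.
    [self performSelector:@selector(connect) withObject:nil afterDelay:2.0];
}

#pragma mark - AsyncSocket delegate methods

- (void)onSocket:(AsyncSocket *)sock didConnectToHost:(NSString *)host port:(UInt16)port
{
    NSLog(@"Connected to %@:%hu", host, port);
}

- (void)onSocketDidDisconnect:(AsyncSocket *)sock
{
    // This fires on the client when the server iPad goes into standby.
    NSLog(@"Disconnected -- retrying");
    [self scheduleRetry];
}

@end
```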
The odd thing is that this bug is only triggered when the server iPad is actually switched "off" (i.e. put into standby) with the sleep/wake button at the top. If I just send the app to the background with the home button, the client maintains its connection to the server. It's only when the device goes into standby that the client receives the disconnect delegate callback, disconnects, and then refuses to reconnect. The server, meanwhile, is completely oblivious to this: no delegate methods fire on it at all.
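For completeness, the server side is nothing exotic; it's essentially the standard AsyncSocket accept pattern, roughly like this (class name and port are placeholders):

```objc
#import <Foundation/Foundation.h>
#import "AsyncSocket.h"

@interface Server : NSObject
@property (nonatomic, strong) AsyncSocket *listenSocket;
@property (nonatomic, strong) NSMutableArray *connectedClients;
@end

@implementation Server

- (void)start
{
    self.connectedClients = [NSMutableArray array];
    self.listenSocket = [[AsyncSocket alloc] initWithDelegate:self];

    NSError *err = nil;
    if (![self.listenSocket acceptOnPort:5000 error:&err]) {
        NSLog(@"Could not start listening: %@", err);
    }
}

- (void)onSocket:(AsyncSocket *)sock didAcceptNewSocket:(AsyncSocket *)newSocket
{
    // Keep a strong reference, or the accepted socket is released and closed.
    [self.connectedClients addObject:newSocket];
}

// These are the callbacks I'd expect when a client drops,
// but neither fires on the server when the device goes into standby:
- (void)onSocket:(AsyncSocket *)sock willDisconnectWithError:(NSError *)err
{
    NSLog(@"Client disconnecting, error: %@", err);
}

- (void)onSocketDidDisconnect:(AsyncSocket *)sock
{
    [self.connectedClients removeObject:sock];
}

@end
```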
To generalise my question:
- What exactly happens to an AsyncSocket server instance when the device is put into standby using the button at the top of the iPad?
- Why are no delegate methods fired, yet any connected clients get disconnected?
- What happens when the device is turned on again?
- Why are clients unable to reconnect?