In my current project I discovered a problem using WebSockets with socket.io and node.js on mobile devices: they seem to have trouble handling socket messages that arrive at a regular interval.
I reduced it to a minimal scenario:
The server (a minimal express.js server) sends messages to the client at a specific interval:
setInterval(function () {
    socket.emit('interval');
}, 500);
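For completeness, the surrounding server setup looks roughly like this (a minimal sketch assuming a recent socket.io; the port is a placeholder):

var express = require('express');
var app = express();
var server = require('http').createServer(app);
var io = require('socket.io')(server);

io.on('connection', function (socket) {
    // emit a tick to this client every 500 ms
    setInterval(function () {
        socket.emit('interval');
    }, 500);
});

server.listen(3000);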
The client just measures the time between the received messages and displays it:
socket.on('interval', function (data) {
    timeElement.html(new Date() - startTime);
    startTime = new Date();
});
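The client-side counterpart, as a sketch assuming jQuery is loaded and #time is the display element (both are illustrative assumptions):

var socket = io();               // connect back to the serving host
var timeElement = $('#time');    // hypothetical display element
var startTime = new Date();

socket.on('interval', function (data) {
    // show the milliseconds elapsed since the previous message
    timeElement.html(new Date() - startTime);
    startTime = new Date();
});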
On a desktop
(using Chrome) the resulting time between messages is pretty stable at 515 ms. So there is roughly a 15 ms delay, but the message interval is consistent.
On a mobile device
(I'm using a Galaxy Nexus with Chrome) the time varies between 400 and 600 ms, with some more extreme spikes in either direction.
I want to use such an interval as a game-turn indicator, and this problem results in a lot of lag and uneven player movement on mobile devices.
You have no guarantees about when a network packet will arrive. If you're not using a reliable protocol like TCP, you can't even be sure that a packet will arrive at all; sometimes packets simply get dropped.
As it happens, you are using TCP (the underlying network-level protocol for WebSockets). When a TCP packet is lost, the stack must detect the loss and then retransmit the packet, which causes a delay. It's possible that the mobile device's network hardware is simply more prone to dropping packets than the hardware in your desktop.
Network message intervals are a terrible mechanism for reliable timing.
Another possible explanation -- or an additional one, if you suspect packet loss is unlikely -- is that on such a low-power mobile device the network driver simply isn't getting enough time on the CPU (or isn't getting its CPU time soon enough), so there are driver-level delays in processing the queue of packets received on the physical network device.
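If you need a steady turn cadence despite jittery delivery, a common technique is to drive turns from a local timer and use the server messages only to correct drift. A rough sketch of that idea (the 'turn' event payload, TURN_MS, and advanceGameTurn are illustrative assumptions, not part of your code):

var TURN_MS = 500;
var localTurn = 0;

// advance the game on a local clock, independent of packet arrival
setInterval(function () {
    localTurn++;
    advanceGameTurn(localTurn);   // hypothetical game-logic hook
}, TURN_MS);

// the server stamps each message with its turn number; the client only
// resynchronizes when the local counter drifts noticeably
socket.on('turn', function (serverTurn) {
    if (Math.abs(serverTurn - localTurn) > 1) {
        localTurn = serverTurn;
    }
});

This way a late or early packet no longer moves the players directly; it only nudges the local counter back in line.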