I'm trying to create a multiplayer game with Node.js, and I want to synchronize actions between clients.
What would be the best way to find the latency (the time a request takes to come back to the client) between the client and the server?
My first idea was that client #1 could send a timestamp with his request, so when client #2 receives client #1's action, he can adjust his action speed to remove the request's delay. But the problem is that the system date/times of the two clients may not be identical, so it is not possible to know the real delay of client #1's request.
The other solution was to use the server's timestamp, but then how can I know the latency of a client?
Overview:
After the socket.io connection has been established, you create a new Date object on the client; let's call it startTime. This is your initial time, before making a request to the server. You then emit a ping event from the client (the naming convention is entirely up to you). Meanwhile, the server should be listening for a ping event, and when it receives the ping, it immediately emits a pong event. The client then catches the pong event, and at that moment creates another date object representing Date.now(). So at this point you have two date objects: the initial date before making the request to the server, and another from after the server replied. Subtract startTime from the current time and you have the latency.

Client
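The original client code was not preserved here, so below is a minimal sketch of the steps just described. The helper names are made up; the socket is assumed to be a socket.io client instance, and note that some socket.io versions use 'ping'/'pong' internally for their own heartbeat, so you may need different event names.

```javascript
// Round trip = time the pong arrived minus time the ping was sent.
function computeLatency(startTime, endTime) {
  return endTime - startTime; // milliseconds
}

function makePinger(socket, reportLatency) {
  let startTime = 0;
  socket.on('pong', () => {
    // server replied: subtract startTime from the current time
    reportLatency(computeLatency(startTime, Date.now()));
  });
  return function sendPing() {
    startTime = Date.now(); // initial time, right before the request
    socket.emit('ping');
  };
}

// Usage in the browser (address and interval are assumptions):
// const sendPing = makePinger(io('http://localhost:3000'), ms => console.log(ms + ' ms'));
// setInterval(sendPing, 2000);
```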
Server
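The server code is likewise missing, so here is a matching sketch: answer every 'ping' immediately with a 'pong' so the client can time the round trip. The handler name and port are assumptions.

```javascript
// Attach this per connection: reply to each 'ping' at once, with no payload.
function attachPingHandler(socket) {
  socket.on('ping', () => {
    socket.emit('pong'); // immediate reply lets the client time the round trip
  });
}

// Usage (port is an assumption):
// const io = require('socket.io')(3000);
// io.on('connection', attachPingHandler);
```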
Also available as a Github Gist.
Here's my really quick and dirty script to test the ping ... just head to http://yourserver:8080 in your browser and watch the console (an SSH terminal, in my case).
I'm very curious about this because it seems like my pings are pretty high (200-400 ms round trip) on large VPS boxes with dedicated resources, both in California and New Jersey. (I'm on the east coast.) I'm betting there's just a lot of latency on the VPS boxes because they're serving so much traffic?

The thing that gets me is that a regular ping from the Linux terminal from the same client to the same server is 11 ms on average, a factor of 10 lower... am I doing something wrong, or is something slow with node.js/socket.io/websockets?
What I usually do to send a timestamp with a request:

1. Create a new Date() on the client and send timestamp: date.getTime() to the server with every JSON request.
2. The server puts processed: (new Date()).getTime() into the object as soon as the request arrives.
3. Just before responding, the server copies the timestamp from the request and overwrites the processed field: processed: (new Date()).getTime() - req.processed, which now contains the number of milliseconds it took to process the request.
4. The client takes the timestamp (which is the same one that was sent in pt 1), subtracts it from the current time, and subtracts the processing time (processed); there is your "real" ping time in milliseconds.

I think you should always include the time for both request and response in the ping time, even if there is one-way communication. This is because that is the standard meaning behind "ping time" and "latency". And if it is one-way communication and the latency is only half of the real ping time, that's just a "good thing".
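The four steps above can be sketched as plain functions (the function names are illustrative, not from the answer; steps 2 and 3 are collapsed into one server-side handler):

```javascript
// Step 1: the client attaches its send time to every JSON request.
function stampRequest(payload) {
  return { ...payload, timestamp: Date.now() };
}

// Steps 2 and 3: the server notes when the request arrived, does its
// work, then replies with the original timestamp plus a `processed`
// field holding the milliseconds spent on the server.
function handleOnServer(req, doWork) {
  const received = Date.now();
  const result = doWork(req);
  return { ...result, timestamp: req.timestamp, processed: Date.now() - received };
}

// Step 4: the client subtracts its original timestamp and the server's
// processing time, leaving the network-only "real" ping time.
function pingFromResponse(res, now) {
  return (now - res.timestamp) - res.processed;
}
```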
I'm going to assume you are using WebSockets or Socket.IO since you are implementing a game where latency matters (and you tagged it as such).
I would think the server should probably measure and keep track of this for each client.
You probably want to implement some sort of ping action that the server can request of the client. As soon as the client receives the request, it sends back a response to the server. The server then divides the round-trip time by 2 and updates the latency for that client. You probably want the server to do this periodically with each client, and to average the last several measurements so that you don't get strange behavior from sudden but temporary spikes.
Then, when there is a message from one client that needs to be sent (or broadcast) to another client, the server can add client1's latency to client2's latency and communicate this as the latency offset to client2 as part of the message. client2 will then know that the event on client1 happened that many milliseconds ago.
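One way to sketch the "average the last several" idea is a small per-client tracker; the class name and default window size here are made up for illustration:

```javascript
// Keeps the last few one-way latency samples for one client and exposes
// a smoothed average, so a temporary spike doesn't distort the offset
// sent to other clients.
class LatencyTracker {
  constructor(windowSize = 5) {
    this.windowSize = windowSize; // how many recent samples to keep
    this.samples = [];
  }

  // The server measures a round trip with its ping request and divides
  // by 2 for the one-way estimate, as described above.
  addRoundTrip(roundTripMs) {
    this.samples.push(roundTripMs / 2);
    if (this.samples.length > this.windowSize) this.samples.shift();
  }

  average() {
    if (this.samples.length === 0) return 0;
    return this.samples.reduce((a, b) => a + b, 0) / this.samples.length;
  }
}

// Usage: the offset for a client1 -> client2 message would be
// tracker1.average() + tracker2.average().
```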
An additional reason to do this on the server is that some browser JavaScript timestamps are inaccurate: http://ejohn.org/blog/accuracy-of-javascript-time/. I suspect node.js timestamps are just as accurate as (or more accurate than) V8's, which is one of the few accurate ones.
After reading all these answers...
...I still wasn't satisfied. I visited the official docs and well, well, well - the solution is already built-in.
You just need to implement it - check out mine:
Client
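The original client code did not survive here, so below is a sketch of what it might look like, using socket.io's acknowledgement callback: when the last argument to emit() is a function, the server can invoke it and the call travels back to the client over the web socket. The event name and reporting are assumptions.

```javascript
// Measure one round trip using a socket.io acknowledgement.
function measureRtt(socket, report) {
  const start = Date.now();       // stopwatch starts
  socket.emit('ping', () => {     // the ack runs back on the client
    report(Date.now() - start);   // stopwatch stops: RTT in milliseconds
  });
}

// Usage in the browser (interval is an assumption):
// const socket = io();
// setInterval(() => measureRtt(socket, ms => console.log('RTT: ' + ms + ' ms')), 2000);
```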
Server
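The matching server sketch, under the same assumptions: the handler receives the client's ack function as its argument and invokes it, with no arguments, as soon as the ping arrives.

```javascript
// Attach per connection: invoking the ack sends the (empty) answer back,
// and the client stops its stopwatch when the ack runs.
function attachAckPing(socket) {
  socket.on('ping', (ack) => {
    if (typeof ack === 'function') ack();
  });
}

// Usage (port is an assumption):
// require('socket.io')(5060).on('connection', attachAckPing);
```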
Read first: Due to repeated questions about why this is supposed to work, let me clarify a bit.
There is a start variable containing the time stamp, and there is the ack() argument in socket.io: socket.io allows you to define a callback function which appears to be executed by the server, but this actually just passes the function arguments through the web socket, and the client then calls the callback.

What happens below (please do check the example code!):

1. The client takes the current timestamp, e.g. 1453213686429, and stores it in start.
2. The client emits a ping event to the server and waits for an answer.
3. The server receives the ping and calls clientCallback with empty arguments (check the demo code if you want to see arguments).
4. clientCallback again takes the current timestamp on the client, e.g. 1453213686449, and knows that 20 ms have passed since it sent the request.

Imagine the druid (client) holding a stopwatch: he pushes the button when the messenger (event) starts running, and pushes it again when the messenger arrives with his scroll (function arguments). The druid then reads the scroll, adds the ingredient names to his potion recipe, and brews the potion (the callback).
Okay, forget the previous paragraph, I guess you got the point.
Although the question has already been answered, here is a short implementation for checking the RTT with socket.io:

Client
Server
Demo Code
Demo code as node module: socketIO-callback.tgz. Set it up and run it, then navigate to http://localhost:5060