I am working on a project where I am trying to measure the latency of packets sent between two Android devices over RTP.
So I went ahead and extended the RTP header with a Unix timestamp in its 12th-19th bytes.
I've received the packets and tried to extract the Unix time from them. However, I am doing something wrong in the decoding process, as you can see in the screenshot. On the left is the time I decoded from the packet, and on the right is the time of arrival. Please ignore the picture of my hand in the corner. (And sorry for the large resolution; I'm not sure how to resize the image on SO.)
I've converted the bytes to hex to try to debug the ginormous numbers I was getting when converting my byte array to a long. I haven't noticed many clues, except for a consistent "41" in my hex values and "14" in my long values.
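For what it's worth, the hex values I've been staring at come from a helper roughly like this (just a sketch; the name toHex and the offsets I pass in are for illustration, not my exact code):

private static String toHex(byte[] bytes, int begin, int end) {
    // Dump bytes[begin..end-1] as space-separated two-digit hex.
    StringBuilder hex = new StringBuilder();
    for (int i = begin; i < end; i++) {
        hex.append(String.format("%02X ", bytes[i] & 0xFF)); // mask so negative bytes print as 00..FF
    }
    return hex.toString().trim();
}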
I'm currently out of ideas as to how to fix this. How do I extract the correct Unix time in millis from my packet?
I'm using someone else's code to generate the bytes I'm putting in the packets; he uses this method to write the SSRC into the header (which is also 64 bits).
private void setLong(byte[] buffer, long n, int begin, int end) {
    // Writes n into buffer[begin..end-1], most significant byte first (end is exclusive).
    for (end--; end >= begin; end--) {
        buffer[end] = (byte) (n % 256); // lowest remaining byte of n
        n >>= 8;                        // shift the next byte down
    }
}
And my code utilizing the above method:
public void setUnixTime() {
    // Stamp every outgoing buffer with the current wall-clock time;
    // setLong writes it into buffer indices 13 through 19 (end is exclusive).
    for (int i = 0; i < mBufferCount; i++) {
        setLong(mBuffers[i], System.currentTimeMillis(), 13, 20);
    }
}
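On the receiving side, what I'm attempting is essentially the inverse of setLong: reassembling the bytes back into a long, most significant byte first. A simplified sketch of that idea (not my exact receive code; I'm assuming the offsets mirror the setUnixTime call above):

private long getLong(byte[] buffer, int begin, int end) {
    // Rebuild a big-endian long from buffer[begin..end-1] (end is exclusive).
    long value = 0;
    for (int i = begin; i < end; i++) {
        value <<= 8;                // make room for the next byte
        value |= buffer[i] & 0xFF;  // mask to avoid sign-extending negative bytes
    }
    return value;
}

I'd then call something like getLong(receivedBuffer, 13, 20), where receivedBuffer is just a placeholder for however the incoming packet's bytes are exposed.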
I'm also interested in people's thoughts on calculating lag over RTP this way, i.e. setting the Unix time on packets and comparing it to the time of arrival.
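Concretely, the comparison I have in mind per packet is just a subtraction, something like this (reusing the hypothetical getLong sketch above, and assuming both devices' clocks agree closely enough for the difference to be meaningful):

long sentMillis = getLong(receivedBuffer, 13, 20);             // sender's embedded wall-clock time
long latencyMillis = System.currentTimeMillis() - sentMillis;  // one-way lag estimate in ms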