I have a server sending chunks of raw audio over a websocket. The idea is to retrieve those chunks and play them back as smoothly as possible.
Here is the most important piece of code:
ws.onmessage = function (event) {
    // requires ws.binaryType = 'arraybuffer' so event.data arrives as an ArrayBuffer
    var view = new Int16Array(event.data);
    var viewf = new Float32Array(view.length);
    // convert signed 16-bit PCM samples to floats in [-1, 1]
    for (var i = 0; i < view.length; i++) {
        viewf[i] = view[i] / 32768;
    }
    audioBuffer = audioCtx.createBuffer(1, viewf.length, 22050);
    audioBuffer.getChannelData(0).set(viewf);
    source = audioCtx.createBufferSource();
    source.buffer = audioBuffer;
    source.connect(audioCtx.destination);
    source.start(0);
};
This works decently well, but there are cracks in the playback: the network latency is not constant, so a new chunk doesn't arrive exactly when the previous one finishes playing. I can end up with either two buffers playing together for a short time or nothing playing at all.
I tried:
- hooking source.onended to start the next chunk, but it is not seamless: there is a crack at the end of every chunk, and each seam accumulates, so the playback drifts further and further behind the stream (a rough sketch of this approach is below);
- appending the new data to the currently playing buffer, but this seems to be forbidden: buffers are of fixed size.
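For reference, the onended chaining I tried looked roughly like this (a minimal sketch; chunkQueue and playNext are hypothetical names, and the queue is assumed to be filled with decoded AudioBuffers from the websocket handler):

var chunkQueue = [];   // decoded AudioBuffers waiting to be played

function playNext() {
    if (chunkQueue.length === 0) return;
    var source = audioCtx.createBufferSource();
    source.buffer = chunkQueue.shift();
    source.connect(audioCtx.destination);
    // onended fires only after the chunk has fully finished (and slightly late),
    // so a small gap appears at every seam and the delays add up
    source.onended = playNext;
    source.start(0);
}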
Is there a proper solution to fix that playback? The only requirement is to play the uncompressed audio coming from a websocket.
EDIT: Solution: since I know my buffers' lengths, I can schedule the playback this way:
if(nextStartTime == 0) nextStartTime = audioCtx.currentTime + (audioBuffer.length / audioBuffer.sampleRate)/2;
source.start(nextStartTime);
nextStartTime += audioBuffer.length / audioBuffer.sampleRate;
The first time, I schedule the start of the playback half a buffer later, to allow for that much unexpected latency. Then, for each chunk, I set the next start time to the exact moment the current buffer ends.
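Putting the two pieces together, the whole handler looks roughly like this (a sketch under the same assumptions as above: ws and audioCtx already exist, the server sends mono 16-bit PCM at 22050 Hz, and nextStartTime starts at 0):

var nextStartTime = 0;
ws.binaryType = 'arraybuffer';   // so event.data is an ArrayBuffer

ws.onmessage = function (event) {
    // decode the incoming 16-bit PCM chunk into floats in [-1, 1]
    var view = new Int16Array(event.data);
    var viewf = new Float32Array(view.length);
    for (var i = 0; i < view.length; i++) {
        viewf[i] = view[i] / 32768;
    }

    var audioBuffer = audioCtx.createBuffer(1, viewf.length, 22050);
    audioBuffer.getChannelData(0).set(viewf);

    var source = audioCtx.createBufferSource();
    source.buffer = audioBuffer;
    source.connect(audioCtx.destination);

    // schedule this chunk to start exactly when the previous one ends;
    // the very first chunk is delayed by half a buffer to absorb network jitter
    if (nextStartTime === 0) {
        nextStartTime = audioCtx.currentTime + (audioBuffer.length / audioBuffer.sampleRate) / 2;
    }
    source.start(nextStartTime);
    nextStartTime += audioBuffer.length / audioBuffer.sampleRate;
};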