I've built some code that uses the MediaRecorder API to capture audio and video, then sends the WebM blob from each ondataavailable event up to a server over a WebSocket. The server relays those blobs over WebSockets to a client, which stitches the video back together in a buffer using the Media Source Extensions (MSE) API.
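For concreteness, a stripped-down version of both ends looks roughly like this (the URLs, the timeslice, and the mime type are placeholders for my actual values):

```javascript
// Capture page: record the stream and ship each blob over the socket
const ws = new WebSocket('wss://example.com/ingest'); // placeholder URL

navigator.mediaDevices.getUserMedia({ audio: true, video: true }).then((stream) => {
  const recorder = new MediaRecorder(stream, { mimeType: 'video/webm; codecs="vp8,opus"' });
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(event.data); // each event.data is a WebM Blob fragment
    }
  };
  recorder.start(1000); // fire ondataavailable roughly every second
});
```

```javascript
// Viewer page: queue incoming blobs and append them to a SourceBuffer
const ws = new WebSocket('wss://example.com/watch'); // placeholder URL
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  const queue = [];
  const appendNext = () => {
    if (queue.length && !sourceBuffer.updating) sourceBuffer.appendBuffer(queue.shift());
  };
  sourceBuffer.addEventListener('updateend', appendNext);

  ws.onmessage = async (event) => {
    queue.push(await event.data.arrayBuffer()); // binary messages arrive as Blobs by default
    appendNext();
  };
});
```

As long as every blob from the very first one arrives in order, this plays fine.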
This works well, except that if I want to start playing the stream partway through, I can't just send the latest blob, because that blob by itself is unplayable. Also, if I send the blobs out of order, the browser usually complains that the audio encoding doesn't match up.
I don't know as much about video containers, codecs, etc. as I probably should to pull this off, but my question is: how can I play those blobs as standalone video? Can I somehow use code to graft the information that's in the first blob (which is playable on its own) onto the other blobs? What would be a good approach to getting the stream to play from partway through? I would transcode, but that seems to take too long, since I'm aiming for real-time (or near-real-time) streaming.
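The server-side idea I'm imagining is something like the sketch below (Node.js with the ws package; the /ingest path and the single cached chunk are made up for illustration): cache the first, header-bearing blob from the recorder and replay it to any viewer who joins late, before forwarding the live blobs.

```javascript
// Hypothetical Node.js relay using the 'ws' package
const { WebSocketServer } = require('ws');
const wss = new WebSocketServer({ port: 8080 });

let initChunk = null;   // the first blob, which (I think) carries the WebM header
const viewers = new Set();

wss.on('connection', (socket, req) => {
  if (req.url === '/ingest') {
    // Broadcaster connection: cache the first chunk, relay everything
    socket.on('message', (data) => {
      if (!initChunk) initChunk = data;
      for (const viewer of viewers) viewer.send(data);
    });
  } else {
    // Viewer connection: replay the cached header chunk before the live data
    if (initChunk) socket.send(initChunk);
    viewers.add(socket);
    socket.on('close', () => viewers.delete(socket));
  }
});
```

What I don't know is whether the header plus an arbitrary later blob is actually decodable like this, which is really what I'm asking.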
Thanks!