I'm working on a project in which I'd like to:
- Load a video with JavaScript and display it on a canvas.
- Use filters to alter the appearance of the canvas (and therefore the video).
- Use the MediaStream captureStream() method and a MediaRecorder object to record the surface of the canvas and the audio of the original video.
- Play the stream of both the canvas and the audio in an HTML video element.
I've been able to display the canvas recording in a video element by tweaking this WebRTC demo code: https://webrtc.github.io/samples/src/content/capture/canvas-record/
That said, I can't figure out how to record the video's audio alongside the canvas. Is it possible to create a MediaStream containing MediaStreamTrack instances from two different sources/elements?
According to the MediaStream API's specs there should theoretically be some way to accomplish this: https://w3c.github.io/mediacapture-main/#introduction
"The two main components in the MediaStream API are the MediaStreamTrack and MediaStream interfaces. The MediaStreamTrack object represents media of a single type that originates from one media source in the User Agent, e.g. video produced by a web camera. A MediaStream is used to group several MediaStreamTrack objects into one unit that can be recorded or rendered in a media element."
Yes, you can do it using the `MediaStream.addTrack()` method. Note, however, that Firefox's MediaRecorder will only record the initial stream's tracks until this bug has been fixed.
OP already knows how to get all of this, but here is a reminder for future readers:
To get a video stream track from the canvas, call the `canvas.captureStream(framerate)` method.

To get an audio stream track from a video element, use the Web Audio API and its `createMediaStreamDestination` method. This returns a MediaStreamAudioDestinationNode (`dest`) containing our audio stream. You then have to connect a MediaElementSource, created from your video element, to this `dest`. If you need to add more audio tracks to this stream, connect all of those sources to `dest` as well.

Now that we have two streams, one for the canvas video and one for the audio, we can call `canvasStream.addTrack(audioStream.getAudioTracks()[0])` just before initializing our `new MediaRecorder(canvasStream)`.

Here is a complete example; it will only work in Chrome for now, and probably in Firefox soon, once they have fixed the bug:
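The steps above can be sketched roughly as follows. This is a minimal, hedged reconstruction, not the original fiddle: it assumes you already have a `<canvas>` and a `<video>` element, and the function name, frame rate, and MIME type are illustrative choices.

```javascript
// Record a canvas's surface together with a video element's audio.
// `canvas` and `video` are existing DOM elements; `onDone` receives the Blob.
function recordCanvasWithAudio(canvas, video, onDone) {
  // 1. Video track from the canvas (30 fps chosen arbitrarily).
  const canvasStream = canvas.captureStream(30);

  // 2. Audio track from the video element via the Web Audio API.
  const actx = new AudioContext();
  const source = actx.createMediaElementSource(video);
  const dest = actx.createMediaStreamDestination();
  source.connect(dest);             // route the video's audio into dest.stream
  source.connect(actx.destination); // keep it audible while recording

  // 3. Merge: append the audio track to the canvas stream.
  canvasStream.addTrack(dest.stream.getAudioTracks()[0]);

  // 4. Record the combined stream.
  const recorder = new MediaRecorder(canvasStream);
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => onDone(new Blob(chunks, { type: 'video/webm' }));
  recorder.start();
  return recorder; // the caller invokes recorder.stop() when finished
}
```

The resulting Blob can then be set as the `src` of another video element via `URL.createObjectURL()`.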
P.S.: Since the Firefox team seems to be taking some time to fix the bug, here is a quick fix to make it work on Firefox too.
You can also mix two tracks by using the `new MediaStream([track1, track2])` constructor.

However, Chrome currently prefixes this constructor; and since it does support `addTrack`, the constructor isn't really needed, and we can come up with something as ugly as the following. Working fiddle for both Firefox and Chrome.
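One way such a feature-detecting merge could be sketched is below. This is an assumption-laden illustration, not the fiddle's code: `mergeTracks`, `videoStream`, and `audioStream` are hypothetical names, where `videoStream` would come from `canvas.captureStream()` and `audioStream` from a `createMediaStreamDestination()` node's `.stream`.

```javascript
// Build one stream from a video source and an audio source, preferring the
// MediaStream constructor and falling back to addTrack() where it is unusable.
function mergeTracks(videoStream, audioStream) {
  const videoTracks = videoStream.getVideoTracks();
  const audioTracks = audioStream.getAudioTracks();
  if (typeof MediaStream === 'function') {
    // Unprefixed constructor available: combine all tracks in one call.
    return new MediaStream([...videoTracks, ...audioTracks]);
  }
  // Fallback: start from the video stream and append each audio track.
  audioTracks.forEach((t) => videoStream.addTrack(t));
  return videoStream;
}
```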
Kaiido's demo is brilliant. For those just looking for the tl;dr code to add an audio stream to their existing canvas stream:
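A condensed sketch of that tl;dr, under the assumption that `canvasStream` already exists from `canvas.captureStream()` and `video` is the source video element (the helper name is illustrative):

```javascript
// Append the video element's audio to an existing canvas capture stream.
function addAudioToCanvasStream(canvasStream, video) {
  const actx = new AudioContext();
  const dest = actx.createMediaStreamDestination();
  actx.createMediaElementSource(video).connect(dest);
  canvasStream.addTrack(dest.stream.getAudioTracks()[0]);
  return canvasStream; // now carries both video and audio tracks
}
```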