I've recently been trying to generate video in the browser, and have thus been playing with two approaches:
- Using the whammy js library to combine WebP frames into WebM video. More details here.
- Using `MediaRecorder` and `canvas.captureStream`. More details here.
The whammy approach works well, but it's only supported in Chrome, since that's currently the only browser that supports WebP encoding (`canvas.toDataURL("image/webp")`). So I'm using the `captureStream` approach as a backup for Firefox (and libwebpjs for Safari).
So now on to my question: Is there a way to control the video quality of the canvas stream? And if not, has something like this been considered by the browsers / W3C?
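For what it's worth, one knob that does exist in the MediaRecorder spec is the bitrate hint passed to the constructor's options. Browsers treat it as a hint rather than a guarantee, and support varies; the helper below is my own, not part of any API:

```javascript
// Build MediaRecorder options with a video bitrate hint.
// The mimeType here is one common choice; in a real page you would check
// MediaRecorder.isTypeSupported() first.
function makeRecorderOptions(videoBitsPerSecond) {
  return {
    mimeType: "video/webm;codecs=vp8",
    videoBitsPerSecond,
  };
}

// In the browser (sketch):
// const recorder = new MediaRecorder(canvas.captureStream(30),
//                                    makeRecorderOptions(5_000_000)); // ~5 Mbps hint
```

Whether the encoder actually honors the hint is up to the browser, but it's the closest thing the spec offers to a quality control.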
Here's a screenshot of one of the frames of the video generated by whammy:
And here's the same frame generated by the `MediaRecorder`/`canvas.captureStream` approach:
My first thought is to artificially increase the resolution of the canvas that I'm streaming, but I don't want the output video to be bigger.
I've tried increasing the frame rate passed to the `captureStream` method (thinking that there may be some strange frame-interpolation behavior happening), but this doesn't help; it actually degrades quality if I set it too high. My current theory is that the browser decides on the quality of the stream based on how much computational power it has available. That makes sense: if it's going to keep up with the frame rate that I've specified, then something has to give.
So my next thought is to slow down the rate at which I'm feeding the canvas with images, and then proportionally lower the FPS value that I pass into `captureStream`. The problem with that is that even though I'd likely have solved the quality problem, I'd end up with a video that runs slower than it's meant to.
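A related option from the Media Capture from DOM Elements spec, in case it helps: passing 0 to `captureStream` disables automatic capture, and each frame is then pushed explicitly with `requestFrame()` on the canvas track. The wrapper below is my own sketch, and note the caveat that pushed frames are still timestamped in real time, so this controls when frames are captured but doesn't by itself retime a slowed-down render:

```javascript
// Sketch, assuming a browser environment: capture manually-pushed frames so
// the drawing rate no longer has to match a fixed requested FPS.
function startManualCapture(canvas) {
  const stream = canvas.captureStream(0); // 0 = no automatic frame capture
  const [track] = stream.getVideoTracks();
  return {
    stream, // pass this to MediaRecorder
    pushFrame() {
      // CanvasCaptureMediaStreamTrack.requestFrame() sends the current
      // canvas contents as one frame of the stream.
      track.requestFrame();
    },
  };
}
```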
Edit: Here's a rough sketch of the code that I'm using, in case it helps anyone in a similar situation.
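(The sketch itself didn't survive in this copy of the post; as a stand-in, here is a minimal reconstruction of the approach described above. All function and variable names are my assumptions, not the original code.)

```javascript
// Hypothetical reconstruction: draw a list of pre-rendered frames onto a
// canvas at a fixed FPS while recording its captured stream, and resolve
// with the resulting WebM blob when the last frame has been drawn.
function recordCanvasFrames(canvas, frames, fps) {
  return new Promise((resolve, reject) => {
    const stream = canvas.captureStream(fps);
    const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });
    const chunks = [];
    recorder.ondataavailable = (e) => { if (e.data.size > 0) chunks.push(e.data); };
    recorder.onstop = () => resolve(new Blob(chunks, { type: "video/webm" }));
    recorder.onerror = reject;

    const ctx = canvas.getContext("2d");
    let i = 0;
    recorder.start();
    const timer = setInterval(() => {
      if (i >= frames.length) {
        clearInterval(timer);
        recorder.stop();
        return;
      }
      ctx.drawImage(frames[i++], 0, 0, canvas.width, canvas.height);
    }, 1000 / fps);
  });
}
```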