I'm currently sending a video stream to Chrome, to play via the MediaSource API.
As I understand it, MediaSource only supports fragmented MP4 (as used by MPEG-DASH), or WebM files whose clusters begin with keyframes (otherwise it raises the error: Media segment did not begin with keyframe).
Is there any way to encode in MPEG-DASH or keyframed WebM formats with FFMPEG in real-time?
Edit:
I just tried it with ffmpeg ... -f webm -vcodec vp8 -g 1
so that every frame is a keyframe. Not the ideal solution, but it does work with MediaSource now. Is there any way to sync up the clusters with the keyframes in WebM, so that not every frame needs to be a keyframe?
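(A minimal sketch of that kind of all-keyframe command, in case it helps anyone reproduce this; the file names are placeholders and libvpx is just FFMPEG's name for the VP8 encoder:)
ffmpeg -i input.mp4 -vcodec libvpx -g 1 -f webm output.webm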
Reference Questions on WebM / MP4 and MediaSource:
Media Source Api not working for a custom webm file (Chrome Version 23.0.1271.97 m)
MediaSource API and mp4
To ensure every cluster in your WebM starts with a keyframe, try something like this:
ffmpeg \
[...inputs] \
-vcodec libvpx \
-keyint_min 60 \
-g 60 \
-vb 4000k \
-f webm \
-cluster_size_limit 10M \
-cluster_time_limit 2100 \
[...output]
Basically, as implemented, every keyframe has to be at the beginning of a cluster, but the inverse is not true: a keyframe always starts a new cluster, but a new cluster does not necessarily start with a keyframe. To get around this, we simply set the cluster limits to something large that we'll never hit.
In this example we get a keyframe every 2 seconds (-keyint_min 60 and -g 60, assuming a 30 fps source), and the cluster time limit is 2.1 seconds, so the time limit is never reached. The bitrate is 4 Mbit/s and the cluster size limit is 10M (bytes), far more than a 2-second cluster at that bitrate, so the size limit is never reached either.
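If you want to double-check that the output really does open every cluster with a keyframe, one option (my own suggestion, not part of the recipe above) is to dump the container structure with mkvinfo from MKVToolNix:
mkvinfo -v output.webm
In verbose mode mkvinfo lists each Cluster and the SimpleBlocks inside it, and marks which blocks are keyframes, so you can confirm that every Cluster starts with one. (output.webm is just a placeholder for whatever your [...output] is.)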
At the moment FFMPEG does not support DASH encoding. You can segment with FFMPEG (https://www.ffmpeg.org/ffmpeg-formats.html#segment_002c-stream_005fsegment_002c-ssegment), but I recommend combining FFMPEG and MP4Box. Use FFMPEG to transcode your live video, and then MP4Box to segment and create the .mpd index.
MP4Box is a part of GPAC (http://gpac.wp.mines-telecom.fr/).
Here is an example using h264:
ffmpeg -threads 4 -f v4l2 -i /dev/video0 -acodec libfaac -ar 44100 -ab 128k -ac 2 -vcodec libx264 -r 30 -s 1280x720 -f mp4 -y "$movie" > temp1.mp4 && MP4Box -dash 10000 -frag 1000 -rap "$movie"
Here -dash 10000 produces 10-second segments, -frag 1000 produces 1-second fragments within them, and -rap forces segments to start on random access points (keyframes).
If you need VP8 (WebM), use -vcodec libvpx and -f webm (or -f ts).
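For example, the ffmpeg half of that pipeline with those substitutions might look roughly like this (an untested sketch; I've also swapped the audio codec to libvorbis, since WebM normally carries Vorbis or Opus rather than AAC, and "$movie" stays a placeholder):
ffmpeg -threads 4 -f v4l2 -i /dev/video0 -acodec libvorbis -ar 44100 -ab 128k -ac 2 -vcodec libvpx -r 30 -s 1280x720 -f webm -y "$movie"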
Another user has had some luck with:
ffmpeg ... \
-f mp4 \
-reset_timestamps 1 \
-movflags empty_moov+default_base_moof+frag_keyframe \
-probesize 200000
Please see galbarm's questions at:
- Live streaming dash content using mp4box
- Flush & Latency Issue with Fragmented MP4 Creation in FFMPEG
Note: If you don't have keyframes on the input video, you may need to set -frag_duration 100000 instead of +frag_keyframe.
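Putting those pieces together, a minimal sketch of a live, MSE-friendly fragmented-MP4 encode might look like this (the v4l2 input, libx264 settings and file name are my own assumptions; only the -movflags come from the command above):
ffmpeg -f v4l2 -i /dev/video0 -an -vcodec libx264 -preset veryfast -tune zerolatency -g 60 -f mp4 -movflags empty_moov+default_base_moof+frag_keyframe stream.mp4
In a real live setup you would typically write to a pipe or socket instead of a file and ship the fragments to the browser (e.g. over WebSockets) for appending to a SourceBuffer.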
I ran into the same situation when trying to play a .webm file recorded with the MediaRecorder API back using Media Source Extensions (MSE). Chrome (51) recordings are malformed; Firefox (46) seems OK.
To get it working you have to fix the cues in the .webm file:
- clone https://github.com/webmproject/libwebm
- make sure you have cmake version >= 3.2 (https://askubuntu.com/questions/610291/how-to-install-cmake-3-2-on-ubuntu-14-04)
- build it and run the bundled sample muxer on your recording:
cmake .
make
./sample_muxer -i original.webm -o fixed.webm
- load fixed.webm into DASH / your own player!
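Not part of the original steps, but as a quick sanity check you can compare what ffprobe reports for the two files; the raw MediaRecorder output typically shows Duration: N/A, while the remuxed file should report a proper duration:
ffprobe original.webm
ffprobe fixed.webm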
Hope it helps someone. It was quite difficult to google any information on this without the DASH keyword (I am not using DASH, only the same underlying technology, MSE). :)