Good day,
I have two mp4 files (a.mp4 and b.mp4), each containing a video stream and an audio stream, and I need to concatenate them into a single mp4 container (c.mp4) using gstreamer (this question is connected to the previous one).
In other words, the following pipeline concatenates the content of a.mp4 and b.mp4 and then outputs the result to autovideosink and alsasink:
GST_DEBUG=3 gst-launch-1.0 concat name=c2 ! videoconvert ! videorate ! autovideosink concat name=c ! audioconvert ! audiorate ! alsasink filesrc location=a.mp4 ! decodebin name=d1 ! audio/x-raw ! queue ! c. filesrc location=b.mp4 ! decodebin name=d2 ! audio/x-raw ! queue ! c. d1. ! video/x-raw ! queue ! c2. d2. ! video/x-raw ! queue ! c2.
Works like a charm! But instead of autovideosink and alsasink, I need to re-encode the concatenated video and audio streams, mux them, and write them into a single container (i.e. there should be a single "filesink location=c.mp4" in the pipeline, if I understand it correctly) - this way I would get the content of a.mp4 followed by the content of b.mp4 (a.mp4 + b.mp4 = c.mp4).
Could someone please share a pipeline which demonstrates how to do this?
Ok, at least you mentioned filesink.. but you should always post something that you have tried (some not-yet-working pipe..). Anyway, here is the magic pipe:
gst-launch-1.0 -e concat name=c2 ! videoconvert ! x264enc tune=4 ! mp4mux name=mp4 ! filesink location=out.mp4 concat name=c ! audioconvert ! voaacenc ! mp4. filesrc location=big.mp4 ! decodebin name=d1 ! audio/x-raw ! queue ! c. filesrc location=big2.mp4 ! decodebin name=d2 ! audio/x-raw ! queue ! c. d1. ! video/x-raw ! queue ! c2. d2. ! video/x-raw ! queue ! c2.
btw you may want to read something about gst-launch
Please notice a few things:
1, There is the -e switch for gst-launch, which sends EOS down the pipe when you interrupt it; that ends the mp4 muxing process properly.. otherwise the metadata will not be written (there is a small example of this at the end of this answer).
2, The pipe does not end automatically.. this is something you can tune up.. maybe some attribute for concat or something.. Maybe try to add streamsynchronizer - but I was not successful with that one. I am not sure if I should put it after concat or before.. maybe you can ask on IRC.
3, How did I build such a pipe?
A, First I checked the capabilities of mp4mux, as I knew I wanted to mux mp4.. (to find it you may type gst-inspect-1.0 | grep mp4 | grep mux if you are on Linux; the exact gst-inspect commands are listed further below). You must remember that src is the output of an element and sink is the input (sometimes it's not very natural to think this way.. just remember the sink is the thing where the water ends up when you wash your hands :D ). So we expect that there is a sink for audio and a sink for video..
B, There are a few possibilities for audio; I chose audio/mpeg... then I remembered that mp4 often uses aac.. so I searched for an aac encoder, which is voaacenc.. I checked its src caps and indeed it's audio/mpeg (version 4).
C, For video there is video/x-h264, which I like most.. so I took good old x264enc, which I use all the time for video.. I thought that maybe I would need h264parse, but it is not needed..
4, Then bundle everything together.. just remember that you can give elements names by using name=something (it's exactly like an alias), but when you reference the element later you do not write just something, you need to put a dot after it, so you write something. (with the trailing dot - see the tiny naming example further below).
5, Also remember that the order in which you put elements into gst-launch is more about linking than about how the data is processed.. think of it as: all you need is to link the elements.. and only then think of the processing itself. You can put *sink elements at the beginning of the pipeline, but then you must give them a name and use that name elsewhere (for example after the mux, as I did).
6, To simplify it.. input processing, then decodebin, which spawns two branches - audio and video.. each type goes to the proper concat.. there are two concats - each has its own type of processing (for video there is videoconvert etc.).. then those two concat branches go through encoding, and after encoding they end at mp4mux.. after the mux there is just the filesink.. that's all (a line-broken version of the pipe is at the very end, if that helps to see the structure).
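Here is the small example for point 1 - just a minimal sketch to see -e in action (videotestsrc is a stand-in source and test.mp4 is an arbitrary output name):

gst-launch-1.0 -e videotestsrc ! videoconvert ! x264enc ! mp4mux ! filesink location=test.mp4

Stop it with Ctrl-C after a few seconds; thanks to -e the interrupt is turned into EOS, so mp4mux finalizes the file and the metadata gets written. If you run it without -e, the resulting file will most likely be unplayable.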
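For point 3, the discovery steps are roughly these gst-inspect calls (the grep pattern is just what I used, adjust it as you like):

gst-inspect-1.0 | grep mp4 | grep mux
gst-inspect-1.0 mp4mux
gst-inspect-1.0 voaacenc
gst-inspect-1.0 x264enc

The first one lists the muxer candidates, the second shows the pad templates of mp4mux (sinks accepting audio/mpeg, video/x-h264 and so on), and the last two show the src caps of the encoders so you can check that they match what mp4mux accepts.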
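For points 4 and 5, a tiny sketch of naming an element and referencing it later with the trailing dot (mymux is an arbitrary name; num-buffers just makes videotestsrc send EOS on its own, so the file gets finalized without Ctrl-C):

gst-launch-1.0 mp4mux name=mymux ! filesink location=named.mp4 videotestsrc num-buffers=300 ! videoconvert ! x264enc ! mymux.

Note that the mux and the filesink come first on the command line and the source branch is linked to them afterwards via mymux. - the order is only about linking, not about the direction of the data flow.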
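And finally the line-broken version of the magic pipe from point 6 - exactly the same elements, just split with shell continuations so the two concat branches, the two encoders and the single mp4mux + filesink are easier to see:

gst-launch-1.0 -e \
  concat name=c2 ! videoconvert ! x264enc tune=4 ! mp4mux name=mp4 ! filesink location=out.mp4 \
  concat name=c ! audioconvert ! voaacenc ! mp4. \
  filesrc location=big.mp4 ! decodebin name=d1 ! audio/x-raw ! queue ! c. \
  filesrc location=big2.mp4 ! decodebin name=d2 ! audio/x-raw ! queue ! c. \
  d1. ! video/x-raw ! queue ! c2. \
  d2. ! video/x-raw ! queue ! c2.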