I want to send images as input to FFmpeg, and I want FFmpeg to output video to a stream (WebRTC format). I found some information that, from my understanding, showed this is possible. I believe FFmpeg can receive images from a pipe; does anyone know how this can be done?
Yes, it's possible to send FFmpeg images by using a pipe. Use the `standardInput` to send frames. The frame data must be uncompressed pixel values (e.g. 24-bit RGB format) in a byte array that holds enough bytes (width × height × 3) to write a full frame.

Normally (in a Command or Terminal window) you set input and output as: `ffmpeg -i inputvid.mp4 outputvid.mp4`.
But for pipes you must first specify the incoming input's width/height, frame rate, etc., and then add the incoming input filename as `-i -` (the blank `-` means FFmpeg watches the `standardInput` connection for incoming raw pixel data).

You must put your frame data into some Bitmap object and send the bitmap's values as a byte array. Each send will be encoded as a new video frame. Example pseudo-code:
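The pseudo-code block itself is missing from this extract; below is a minimal Java sketch of what such a function could look like. Only the `makeVideoFrame` name comes from the post; the `FrameWriter` wrapper and the `rgb24` byte layout (matching the width × height × 3 description above) are assumptions.

```java
import java.awt.image.BufferedImage;
import java.io.IOException;
import java.io.OutputStream;

public class FrameWriter {
    private final OutputStream ffmpegIn; // FFmpeg's standardInput (the pipe)

    public FrameWriter(Process ffmpegProc) {
        this.ffmpegIn = ffmpegProc.getOutputStream();
    }

    // Writes one bitmap as a single raw rgb24 frame:
    // 3 bytes (R, G, B) per pixel, width * height * 3 bytes in total.
    public void makeVideoFrame(BufferedImage bmp) throws IOException {
        int w = bmp.getWidth();
        int h = bmp.getHeight();
        byte[] frame = new byte[w * h * 3];
        int i = 0;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int px = bmp.getRGB(x, y);               // packed 0xAARRGGBB
                frame[i++] = (byte) ((px >> 16) & 0xFF); // R
                frame[i++] = (byte) ((px >> 8)  & 0xFF); // G
                frame[i++] = (byte) ( px        & 0xFF); // B
            }
        }
        ffmpegIn.write(frame); // FFmpeg encodes this as one video frame
        ffmpegIn.flush();
    }
}
```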
Any time you update your bitmap with new pixel information, you can write it out as a new frame by passing that bitmap as the input parameter to the above function, e.g. `makeVideoFrame(my_new_frame_BMP);`.

Your pipe's Process must start with these arguments:
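The argument list itself didn't survive in this extract; judging from the flags explained just below, it was presumably along these lines:

```
-f rawvideo -pix_fmt argb -s 800x600 -r 25 -i -
```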
Where...

`-f rawvideo -pix_fmt argb` means accept uncompressed pixel data (note that `argb` is 4 bytes per pixel, so the byte array must hold width × height × 4 bytes; for the 24-bit RGB layout described above, use `-pix_fmt rgb24` with width × height × 3). `-s 800x600` is an example input width & height, and `-r 25` sets the frame rate, meaning FFmpeg must encode this many images per one second of output video.

The full setup looks like this:
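The full setup block is also missing from the extract; here is a minimal, runnable Java sketch of the whole pipeline, reusing the `FrameWriter` class from above and assuming `ffmpeg` is on the PATH (the generated test frames and the `out.h264` file name are placeholders):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class PipeDemo {
    public static void main(String[] args) throws Exception {
        int w = 800, h = 600, fps = 25;

        // Start FFmpeg reading raw rgb24 frames from stdin, encoding to H.264.
        // An optional audio input (e.g. "-i", "someTrack.mp3") could be added
        // before the output options.
        Process ffmpeg = new ProcessBuilder(
                "ffmpeg",
                "-f", "rawvideo", "-pix_fmt", "rgb24",
                "-s", w + "x" + h, "-r", String.valueOf(fps),
                "-i", "-",                    // "-" = read from standardInput
                "-c:v", "libx264", "-y", "out.h264")
            // Show FFmpeg's log on the console so its pipes can't fill up
            // and block the encoder.
            .redirectOutput(ProcessBuilder.Redirect.INHERIT)
            .redirectError(ProcessBuilder.Redirect.INHERIT)
            .start();

        FrameWriter writer = new FrameWriter(ffmpeg);

        // Feed 100 generated frames (4 seconds of video at 25 fps).
        BufferedImage bmp = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = bmp.createGraphics();
        for (int n = 0; n < 100; n++) {
            g.setColor(new Color((n * 2) % 256, 0, 0)); // changing solid color
            g.fillRect(0, 0, w, h);
            writer.makeVideoFrame(bmp);                 // one frame per call
        }
        g.dispose();

        // Closing stdin signals end-of-input so FFmpeg finalizes the file.
        ffmpeg.getOutputStream().close();
        ffmpeg.waitFor();
    }
}
```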
If you get blocky video output, try setting two output files...
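The two-output command itself is missing from the extract. FFmpeg does accept several output files in one invocation, with the options placed before each one applying only to that output, so the suggestion presumably means something like this (the bitrate and file names are illustrative guesses):

```
ffmpeg -f rawvideo -pix_fmt rgb24 -s 800x600 -r 25 -i - -c:v libx264 test.h264 -c:v libx264 -b:v 4M test_hq.h264
```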
This will output a test `h264` video file which you can later put inside an MP4 container. The audio track (`-i someTrack.mp3`) is optional.