Is it possible to send ffmpeg images by using a pipe?

Posted 2020-06-27 04:15

I want to send images as input to ffmpeg and I want ffmpeg to output video to a stream (webRtc format.)

I found some information suggesting this is possible. I believe that ffmpeg can receive images from a pipe; does anyone know how this can be done?

Tags: ffmpeg
1 Answer
贪生不怕死
#2 · 2020-06-27 04:30

"I want to send images as input to FFmpeg... I believe that FFmpeg could receive image from a pipe, does anyone know how this can be done?"

Yes, it's possible to send FFmpeg images by using a pipe. Use the standardInput to send frames. The frame data must be uncompressed pixel values (e.g. 24-bit RGB or 32-bit ARGB) in a byte array that holds enough bytes (width × height × bytes-per-pixel) to write a full frame.

Normally (in Command or Terminal window) you set input and output as:

ffmpeg -i inputvid.mp4 outputvid.mp4

But for pipes you must first specify the incoming input's width/height, frame rate, etc. Then also add the incoming input's filename as -i - (a lone - means FFmpeg reads the incoming raw pixel data from its standardInput connection).

You must put your frame data into some Bitmap object and send the bitmap's pixel values as a byte array. Each write will be encoded as a new video frame. Example pseudo-code:

public function makeVideoFrame ( frame_BMP:Bitmap ) : void
{
    //# Encodes the pixel bytes of a Bitmap object as an FFmpeg video frame
    if ( myProcess.running == true )
    {
        //# Read ARGB pixel values into a byte array.
        //# (Bitmap has no getBytes(); use BitmapData.getPixels on the full rect.)
        Frame_Bytes = frame_BMP.bitmapData.getPixels( frame_BMP.bitmapData.rect );
        myProcess.standardInput.writeBytes( Frame_Bytes ); //# send data to FFmpeg to encode a new frame
        Frame_Bytes.clear(); //# empty byte array for re-use with next frame
    }
}
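The same pattern works in any language that can spawn a process and write to its stdin. Below is a minimal Python sketch of the idea; `build_ffmpeg_args` and `frame_byte_count` are illustrative helper names (not part of any library), and it assumes an `ffmpeg` binary with libx264 is on the PATH:

```python
import subprocess

def build_ffmpeg_args(width, height, fps, out_path):
    # Mirror the pipe arguments from this answer: raw ARGB frames arrive on stdin ("-i -").
    return ["ffmpeg", "-y",
            "-f", "rawvideo", "-pix_fmt", "argb",
            "-s", f"{width}x{height}", "-r", str(fps),
            "-i", "-",
            "-c:v", "libx264", "-profile:v", "baseline",
            "-an", out_path]

def frame_byte_count(width, height, bytes_per_pixel=4):
    # ARGB is 4 bytes per pixel; every write must total exactly one full frame.
    return width * height * bytes_per_pixel

if __name__ == "__main__":
    w, h, fps = 800, 600, 25
    proc = subprocess.Popen(build_ffmpeg_args(w, h, fps, "out_vid.h264"),
                            stdin=subprocess.PIPE)
    # One solid blue frame: byte order is A, R, G, B for pix_fmt argb.
    blue_frame = bytes([255, 0, 0, 255]) * (w * h)
    for _ in range(fps):  # write one second of video
        proc.stdin.write(blue_frame)
    proc.stdin.close()
    proc.wait()
```

Each `proc.stdin.write` here plays the role of `standardInput.writeBytes` in the ActionScript version above.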

Any time you update your bitmap with new pixel information, you can write it out as a new frame by passing that bitmap to the function above, e.g. makeVideoFrame( my_new_frame_BMP );.

Your pipe's Process must start with these arguments:

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - ....etc

Where...

  • -f rawvideo -pix_fmt argb means accept uncompressed 32-bit ARGB pixel data.

  • -s 800x600 sets an example input width & height, and -r 25 sets the frame rate, meaning FFmpeg must encode this many images per second of output video.
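As a quick sanity check on write sizes for those example settings (FFmpeg reads stdin in whatever chunks arrive, but each frame must total exactly this many bytes):

```python
width, height = 800, 600
bytes_per_pixel = 4                 # argb: alpha, red, green, blue
frame_size = width * height * bytes_per_pixel
print(frame_size)                   # bytes per frame: 1920000
print(frame_size * 25)              # bytes per second at -r 25: 48000000
```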

The full setup looks like this:

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500k -an out_vid.h264

If you get blocky video output, try specifying two output files...

-y -f rawvideo -pix_fmt argb -s 800x600 -r 25 -i - -c:v libx264 -profile:v baseline -level:v 3 -b:v 2500k -an out_tempData.h264 out_vid.h264

This outputs a raw H.264 video file which you can later put inside an MP4 container.
The audio track -i someTrack.mp3 is optional.

-i myH264vid.h264 -i someTrack.mp3 outputVid.mp4