How can I generate a video file directly from an FFmpeg filter source?

Posted 2019-02-01 10:20

FFmpeg has a number of video generating filters, listed in the documentation as "video sources":

  • cellauto
  • color
  • mptestsrc
  • frei0r_src
  • life
  • nullsrc, rgbtestsrc, testsrc

Those are great for using with other filters like overlay, but is there any way that I can generate a movie consisting of just one of those video sources without any input video?

Something like:

ffmpeg -vf "color=red" red_movie.mp4

Except that this errors out with "At least one input file must be specified".

Tags: video ffmpeg
2 Answers
兄弟一词,经得起流年. · 2019-02-01 10:45

It looks like the options have changed slightly in recent versions.

To use the filter input sources, you have to:

  1. Set the input format to the libavfilter virtual device using: -f lavfi
  2. Set the filter input source using the -i flag (not -vf)
  3. Provide arguments as complete key-value pairs, like: color=color=red

This works for ffplay, too, to test your filtergraph: ffplay -f lavfi -i color
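For example, to preview a source with explicit options before committing to an encode, you could run something like the following (the size and rate values here are purely illustrative):

ffplay -f lavfi -i "testsrc=size=1280x720:rate=30"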

Examples

In these examples I've added -t 30 to specify that I only want 30 seconds of output.

Color (Red)

ffmpeg -f lavfi -i color=color=red -t 30 red.mp4
                     ^     ^    ^
                     |     |    |
                   filter key value

The key can be shortened to its abbreviated form: -i color=c=red
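If you also want to control the frame size, rate, and duration, the color source accepts further key=value pairs separated by colons. A sketch, with the specific values chosen only as examples:

ffmpeg -f lavfi -i "color=c=red:s=640x480:r=25:d=30" red_640.mp4

With d=30 set on the source itself, the separate -t 30 flag isn't needed.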

SMPTE Color Bars Pattern

ffmpeg -f lavfi -i smptebars -t 30 smpte.mp4
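If your build includes it, there is also an HD variant of this source, smptehdbars, which takes the same kind of options (the size below is just an example):

ffmpeg -f lavfi -i smptehdbars=s=1920x1080 -t 30 -pix_fmt yuv420p smpte_hd.mp4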

Test Source Pattern

ffmpeg -f lavfi -i testsrc -t 30 -pix_fmt yuv420p testsrc.mp4

In order for this to play back reliably, you might need to set the pixel format with: -pix_fmt yuv420p

By default, ffmpeg will use yuv444p (x264, High 4:4:4 Predictive), which some players aren't yet able to decode.

For instance, the video it creates is crashing VLC 2.0.7 and is just 30 seconds of black in QuickTime Player 10.2 (Mac OS X 10.8.4).
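If you're not sure which pixel format a given file ended up with, ffprobe (usually installed alongside ffmpeg) can report it, for instance:

ffprobe -v error -select_streams v:0 -show_entries stream=pix_fmt -of default=noprint_wrappers=1 testsrc.mp4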

More info on testsrc is in the FFmpeg filter documentation.

RGB Test Source

ffmpeg -f lavfi -i rgbtestsrc -pix_fmt yuv420p -t 30 rgbtestsrc.mp4

As with the last example, this might not work for you unless you set the pixel format to yuv420p as shown.
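The same idea extends to audio: you can pair a video source with the anullsrc audio source to get a silent soundtrack, still without any input files. A rough sketch (the codec choice and option values are illustrative, and older builds may need -strict experimental for the native AAC encoder):

ffmpeg -f lavfi -i testsrc=d=30 -f lavfi -i anullsrc=r=44100:cl=stereo -pix_fmt yuv420p -c:a aac -t 30 silent_test.mp4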

For posterity, here's the version I'm using:

ffmpeg version 1.2.1
libavutil      52. 18.100 / 52. 18.100
libavcodec     54. 92.100 / 54. 92.100
libavformat    54. 63.104 / 54. 63.104
libavdevice    54.  3.103 / 54.  3.103
libavfilter     3. 42.103 /  3. 42.103
libswscale      2.  2.100 /  2.  2.100
libswresample   0. 17.102 /  0. 17.102
libpostproc    52.  2.100 / 52.  2.100
做个烂人 · 2019-02-01 10:58

Though hinted at in the documentation, this isn't explicitly spelled out. I was pleased to figure it out, so I thought I'd share.

The key is to use the special format lavfi:

Libavfilter input virtual device.

This input device reads data from the open output pads of a libavfilter filtergraph.

For each filtergraph open output, the input device will create a corresponding stream which is mapped to the generated output. Currently only video data is supported. The filtergraph is specified through the option ‘graph’.

Essentially, the lavfi format causes the input string to be treated as a filtergraph rather than a filename.
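And because it really is a filtergraph, you aren't limited to a bare source; you can chain further filters onto it. For instance, this sketch (the filter choice and values are just one illustration) desaturates the test pattern with the hue filter:

ffmpeg -f lavfi -i "testsrc=s=640x360:r=25,hue=s=0" -t 10 -pix_fmt yuv420p gray_test.mp4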

Thus, to make a movie consisting of nothing but red, the command is:

ffmpeg -f lavfi -i color=red -frames:v 200 red_movie.mp4

(Specifying the number of frames or otherwise limiting the input is crucial as filters generally have no fixed "end" point, and will happily go on generating video forever.)
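Alternatively, many of these sources take a duration (d) option of their own, so you can bound the output on the input side instead of with -frames:v. A minimal sketch:

ffmpeg -f lavfi -i color=c=red:d=10 red_movie.mp4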
