How to encode video using FFmpeg on Android (using H263)

Posted 2020-06-28 08:53

I am trying to follow the sample encoding code in the FFmpeg documentation. I managed to build an application that encodes and generates an mp4 file, but I face the following problems:

1) I am using H263 for encoding, but I can only set the width and height of the AVCodecContext to 176x144; for other sizes (like 720x480 or 640x480) opening the codec fails.

2) I can't play the output mp4 file with the default Android player. Doesn't it support H263 in an mp4 file? (P.S. I can play the file with other players.)

3) Is there any sample code for encoding other video frames into a new video (that is, decoding a video and encoding it back with different quality settings)? I would also like to modify the frame content.

Here is my code, thanks!

#include <jni.h>
#include <android/log.h>
#include <stdio.h>
#include <stdlib.h>
#include "libavcodec/avcodec.h"

/* assumed log macro -- the original post does not show how LOGI is defined */
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO, "FFEncoder", __VA_ARGS__)

JNIEXPORT jint JNICALL Java_com_ffmpeg_encoder_FFEncoder_nativeEncoder(JNIEnv* env, jobject thiz, jstring filename){

LOGI("nativeEncoder()");

avcodec_register_all();
avcodec_init();
av_register_all();

AVCodec         *codec;
AVCodecContext  *codecCtx;
int             i;
int             out_size;
int             size;
int             x;
int             y;
int             output_buffer_size;
FILE            *file;
AVFrame         *picture;
uint8_t         *output_buffer;
uint8_t         *picture_buffer;

/* Manual Variables */
int             l;
int             fps = 30;
int             videoLength = 5;

/* find the H263 video encoder */
codec = avcodec_find_encoder(CODEC_ID_H263);
if (!codec) {
    LOGI("avcodec_find_encoder() run fail.");
}

codecCtx = avcodec_alloc_context();
picture = avcodec_alloc_frame();

/* put sample parameters */
codecCtx->bit_rate = 400000;
/* resolution must be a multiple of two */
codecCtx->width = 176;
codecCtx->height = 144;
/* frames per second */
codecCtx->time_base = (AVRational){1,fps};
codecCtx->pix_fmt = PIX_FMT_YUV420P;
codecCtx->codec_id = CODEC_ID_H263;
codecCtx->codec_type = AVMEDIA_TYPE_VIDEO;

/* open it */
if (avcodec_open(codecCtx, codec) < 0) {
    LOGI("avcodec_open() run fail.");
}

const char* mfileName = (*env)->GetStringUTFChars(env, filename, 0);

file = fopen(mfileName, "wb");
if (!file) {
    LOGI("fopen() run fail.");
}

(*env)->ReleaseStringUTFChars(env, filename, mfileName);

/* alloc image and output buffer */
output_buffer_size = 100000;
output_buffer = malloc(output_buffer_size);

size = codecCtx->width * codecCtx->height;
picture_buffer = malloc((size * 3) / 2); /* size for YUV 420 */

picture->data[0] = picture_buffer;
picture->data[1] = picture->data[0] + size;
picture->data[2] = picture->data[1] + size / 4;
picture->linesize[0] = codecCtx->width;
picture->linesize[1] = codecCtx->width / 2;
picture->linesize[2] = codecCtx->width / 2;

for(l=0;l<videoLength;l++){
    //encode 1 second of video
    for(i=0;i<fps;i++) {
        //prepare a dummy image YCbCr
        //Y
        for(y=0;y<codecCtx->height;y++) {
            for(x=0;x<codecCtx->width;x++) {
                picture->data[0][y * picture->linesize[0] + x] = x + y + i * 3;
            }
        }

        //Cb and Cr
        for(y=0;y<codecCtx->height/2;y++) {
            for(x=0;x<codecCtx->width/2;x++) {
                picture->data[1][y * picture->linesize[1] + x] = 128 + y + i * 2;
                picture->data[2][y * picture->linesize[2] + x] = 64 + x + i * 5;
            }
        }

        //encode the image
        out_size = avcodec_encode_video(codecCtx, output_buffer, output_buffer_size, picture);
        fwrite(output_buffer, 1, out_size, file);
    }

    //get the delayed frames
    for(; out_size; i++) {
        out_size = avcodec_encode_video(codecCtx, output_buffer, output_buffer_size, NULL);
        fwrite(output_buffer, 1, out_size, file);
    }
}

//add sequence end code to have a real mpeg file
output_buffer[0] = 0x00;
output_buffer[1] = 0x00;
output_buffer[2] = 0x01;
output_buffer[3] = 0xb7;

fwrite(output_buffer, 1, 4, file);
fclose(file);
free(picture_buffer);
free(output_buffer);
avcodec_close(codecCtx);
av_free(codecCtx);
av_free(picture);

LOGI("finish");

return 0;
}

2 Answers
Luminary・发光体
#2 · 2020-06-28 09:40

The code supplied in the question (I used it myself at first) seems to generate only a very rudimentary container format, if any. I found that this example, http://cekirdek.pardus.org.tr/~ismail/ffmpeg-docs/output-example_8c-source.html, worked much better, as it creates a real container for the video and audio streams. My video is now playable on the Android device.
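For reference, here is a minimal sketch of that muxing approach, written against the same deprecated FFmpeg API generation as the code in the question (av_new_stream, av_set_parameters, url_fopen and avcodec_encode_video have since been replaced in newer FFmpeg releases). The function name, bit rate and frame size are placeholders and error checking is trimmed; treat it as an outline of the flow in output-example.c, not a drop-in implementation:

#include "libavformat/avformat.h"   /* also pulls in libavcodec */

/* Encode `nframes` copies of an already-filled YUV420P frame and mux them
 * into whatever container av_guess_format() picks from the file name. */
static int mux_h263_sketch(const char *filename, AVFrame *picture, int nframes, int fps)
{
    AVOutputFormat  *fmt;
    AVFormatContext *oc;
    AVStream        *st;
    AVCodec         *codec;
    AVPacket        pkt;
    uint8_t         *outbuf;
    int             outbuf_size = 100000;
    int             i, out_size;

    av_register_all();

    fmt = av_guess_format(NULL, filename, NULL);   /* e.g. mp4 for "*.mp4" */
    oc  = avformat_alloc_context();
    oc->oformat = fmt;

    st = av_new_stream(oc, 0);                     /* one video stream */
    st->codec->codec_id   = CODEC_ID_H263;
    st->codec->codec_type = AVMEDIA_TYPE_VIDEO;
    st->codec->bit_rate   = 400000;
    st->codec->width      = 176;                   /* must be a legal H263 size */
    st->codec->height     = 144;
    st->codec->time_base  = (AVRational){1, fps};
    st->codec->pix_fmt    = PIX_FMT_YUV420P;

    codec = avcodec_find_encoder(st->codec->codec_id);
    av_set_parameters(oc, NULL);
    avcodec_open(st->codec, codec);

    url_fopen(&oc->pb, filename, URL_WRONLY);
    av_write_header(oc);

    outbuf = malloc(outbuf_size);
    for (i = 0; i < nframes; i++) {
        out_size = avcodec_encode_video(st->codec, outbuf, outbuf_size, picture);
        if (out_size > 0) {
            av_init_packet(&pkt);
            pkt.stream_index = st->index;
            pkt.data         = outbuf;   /* wrap the encoder output in a packet
                                            instead of fwrite()ing the raw bitstream */
            pkt.size         = out_size;
            av_interleaved_write_frame(oc, &pkt);
        }
    }

    av_write_trailer(oc);                /* finishes the container so players can read it */
    avcodec_close(st->codec);
    free(outbuf);
    url_fclose(oc->pb);
    av_free(oc);
    return 0;
}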

神经病院院长
#3 · 2020-06-28 09:49

H263 accepts only certain resolutions:

128 x 96
176 x 144
352 x 288
704 x 576
1408 x 1152

It will fail with anything else.
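If you want to guard against this in the question's code, you can check the requested size against these five standard picture formats before calling avcodec_open(). A small sketch (the helper name is made up for illustration):

/* The picture sizes plain H263 supports: SQCIF, QCIF, CIF, 4CIF, 16CIF. */
static const int h263_sizes[][2] = {
    {128, 96}, {176, 144}, {352, 288}, {704, 576}, {1408, 1152}
};

/* Returns 1 if width x height is a legal H263 frame size, 0 otherwise. */
static int is_valid_h263_size(int width, int height)
{
    int i;
    for (i = 0; i < (int)(sizeof(h263_sizes) / sizeof(h263_sizes[0])); i++) {
        if (h263_sizes[i][0] == width && h263_sizes[i][1] == height)
            return 1;
    }
    return 0;
}

/* e.g. in the question's code, before avcodec_open():
 *   if (!is_valid_h263_size(codecCtx->width, codecCtx->height))
 *       LOGI("unsupported H263 frame size");
 */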
