How do I reduce the video size captured by the default camera in Android?

Posted 2019-03-21 14:25

I am trying to reduce the size of the video captured by the default camera (it generates high-resolution video) in Android. Does FFMPEG have an option to encode a video at a given resolution? I tried Googling, but all the examples use FFMPEG in command-line mode.

My questions are:

  1. Can we use ffmpeg command line in Android?
  2. If not, how can we achieve it?
  3. Can we record a video directly using ffmpeg in Android?
  4. Is there any other solution for this?

1 Answer
beautiful°
Answered 2019-03-21 15:06

Compiling ffmpeg for Android is possible, as is running ffmpeg from the command line. There's no need to delve into native code and JNI calls unless you need more advanced usage than what the command line provides.

For reference, this is the shell script I run to compile ffmpeg (run it under Ubuntu; it makes things a lot easier than Windows):

#!/bin/bash
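# Cross-compiles ffmpeg as a static binary for Android (ARMv5) using the NDK toolchain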


ANDROID_API=android-3
export ANDROID_NDK=${HOME}/android-ndk
export ANDROID_SDK=${HOME}/android-sdk
SYSROOT=$ANDROID_NDK/platforms/$ANDROID_API/arch-arm
ANDROID_BIN=$ANDROID_NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/*-x86/bin/
CROSS_COMPILE=${ANDROID_BIN}/arm-linux-androideabi-
export PATH=$PATH:$ANDROID_SDK/tools:$ANDROID_SDK/platform-tools

export ARM_ROOT=${HOME}/android-ndk
export ARM_INC=$ARM_ROOT/platforms/android-5/arch-arm/usr/include
export ARM_LIB=$ARM_ROOT/platforms/android-5/arch-arm/usr/lib
export LIB_INC=${HOME}/include
export LIB_LIB=${HOME}/lib
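# Compiler and linker flags for a soft-float ARMv5 Android target, linking against the NDK platform libraries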
CFLAGS=" -I$ARM_INC -fPIC -DANDROID -fpic -mthumb-interwork -ffunction-sections -funwind-tables -fstack-protector -fno-short-enums -D__ARM_ARCH_5__ -D__ARM_ARCH_5T__ -D__ARM_ARCH_5E__ -D__ARM_ARCH_5TE__  -Wno-psabi -march=armv5te -mtune=xscale -msoft-float -mthumb -Os -fomit-frame-pointer -fno-strict-aliasing -finline-limit=64 -DANDROID  -Wa,--noexecstack -MMD -MP "
LDFLAGS=" -nostdlib -Bdynamic  -Wl,--no-undefined -Wl,-z,noexecstack  -Wl,-z,nocopyreloc -Wl,-soname,/system/lib/libz.so -Wl,-rpath-link=$ARM_LIB,-dynamic-linker=/system/bin/linker -L$ARM_LIB -nostdlib $ARM_LIB/crtbegin_dynamic.o $ARM_LIB/crtend_android.o -lc -lm -ldl -lgcc "

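# configure options: a small, static build with all assembly optimizations and input devices disabled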
FLAGS="--target-os=linux --enable-cross-compile --cross-prefix=$CROSS_COMPILE --arch=arm --prefix=$HOME --disable-shared --enable-static --extra-libs=-static --extra-cflags=--static --enable-small --disable-asm --disable-yasm --disable-amd3dnow --disable-amd3dnowext --disable-mmx --disable-mmx2 --disable-sse --disable-ssse3 --disable-indevs"
export CFLAGS
export LDFLAGS
./configure $FLAGS --extra-cflags="$CFLAGS" --extra-ldflags="$LDFLAGS" \
--cc="${CROSS_COMPILE}gcc --sysroot=${SYSROOT}" \
--cxx="${CROSS_COMPILE}g++ --sysroot=${SYSROOT}" \
--nm="${CROSS_COMPILE}nm" \
--ar="${CROSS_COMPILE}ar"
make clean
make -j4 || exit 1
make install || exit 1

As for running ffmpeg: first copy the ffmpeg binary into your application's files directory and chmod 755 it using Runtime.getRuntime().exec() (a fuller sketch follows below), then run ffmpeg with a line like this:

Process p = Runtime.getRuntime().exec("/data/data/yourpackagename/files/ffmpeg -i in.mp4 out.mp4");
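
Putting those steps together, here is a rough sketch of the whole flow, assuming the compiled ffmpeg binary has been bundled as an asset named ffmpeg; the class name, file names and the scale/bitrate values (640:-2, 1000k) are placeholder choices, not something prescribed above. The -vf scale and -b:v options are the standard command-line way to bring the resolution and bitrate down, which is what the original question asks for.

import android.content.Context;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;

public class FfmpegHelper {

    // Copies the bundled ffmpeg binary out of assets, makes it executable,
    // then re-encodes in.mp4 to a smaller out.mp4 inside the app's files dir.
    public static void shrinkVideo(Context context) throws Exception {
        File ffmpeg = new File(context.getFilesDir(), "ffmpeg");

        // One-time copy: assets/ffmpeg -> /data/data/<package>/files/ffmpeg
        if (!ffmpeg.exists()) {
            InputStream in = context.getAssets().open("ffmpeg");
            OutputStream out = new FileOutputStream(ffmpeg);
            byte[] buf = new byte[8192];
            int len;
            while ((len = in.read(buf)) != -1) {
                out.write(buf, 0, len);
            }
            out.close();
            in.close();
        }

        // chmod 755 via exec, as described in the answer above
        Runtime.getRuntime().exec("chmod 755 " + ffmpeg.getAbsolutePath()).waitFor();

        // Re-encode the captured clip at a lower resolution and bitrate
        File inFile = new File(context.getFilesDir(), "in.mp4");
        File outFile = new File(context.getFilesDir(), "out.mp4");
        String cmd = ffmpeg.getAbsolutePath()
                + " -i " + inFile.getAbsolutePath()
                + " -vf scale=640:-2 -b:v 1000k "
                + outFile.getAbsolutePath();
        Process p = Runtime.getRuntime().exec(cmd);
        p.waitFor();
    }
}

The -2 in scale=640:-2 keeps the aspect ratio while rounding the height to an even number, which most encoders require; a fixed frame size such as -s 640x480 works as well.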

Now, getting the camera's input to ffmpeg in a format it can understand is the tough bit, which I'm still trying to figure out. I've got a Stack Overflow question open on the topic: Decode android's hardware encoded H264 camera feed using ffmpeg in real time
