I have built FFmpeg 0.8.12 (Love) with the Android NDK (r8c) on Ubuntu. I then use the generated library in another Android application through JNI.
Essentially, what I want to do is pass a byte stream from Java to my C JNI function and use FFmpeg to decode it into a PCM audio buffer, which is then passed back to Java to be played with Android's AudioTrack. I can successfully pass the buffer through to JNI (I have checked the values), and FFmpeg seems to initialise correctly, but when it tries to decode the first frame it throws an error in the aac_decode_frame_int method in aacdec.c: "channel element 0.0 is not allocated". The AAC file is valid and plays fine.
Here is my JNI code that does the decoding:
jint Java_com_example_testffmpeg_MainActivity_decodeAacBytes(JNIEnv *env,
        jobject this, jbyteArray input, jint numBytes) {
    // copy bytes from java
    jbyte *bufferPtr = (*env)->GetByteArrayElements(env, input, NULL);
    uint8_t inputBytes[numBytes + FF_INPUT_BUFFER_PADDING_SIZE];
    memset(inputBytes, 0, numBytes + FF_INPUT_BUFFER_PADDING_SIZE);
    memcpy(inputBytes, bufferPtr, numBytes);
    (*env)->ReleaseByteArrayElements(env, input, bufferPtr, 0);

    av_register_all();
    AVCodec *codec = avcodec_find_decoder(CODEC_ID_AAC);
    if (codec == NULL) {
        LOGE("Cant find AAC codec\n");
        return 0;
    }
    LOGI("AAC codec found\n");

    AVCodecContext *avCtx = avcodec_alloc_context();
    if (avCtx == NULL) {
        LOGE("Could not allocate codec context\n");
        return 0;
    }
    LOGI("codec context allocated\n");

    if (avcodec_open2(avCtx, codec, NULL) < 0) {
        LOGE("Could not open codec\n");
        return 0;
    }
    LOGI("AAC codec opened");

    //the input buffer
    AVPacket avPacket;
    av_init_packet(&avPacket);
    LOGI("AVPacket initialised\n");
    avPacket.size = numBytes; //input buffer size
    avPacket.data = inputBytes; // the input buffer

    int outSize;
    int len;
    uint8_t *outbuf = malloc(AVCODEC_MAX_AUDIO_FRAME_SIZE);
    while (avPacket.size > 0) {
        outSize = AVCODEC_MAX_AUDIO_FRAME_SIZE;
        len = avcodec_decode_audio3(avCtx, (short *) outbuf, &outSize,
                &avPacket);
        if (len < 0) {
            LOGE("Error while decoding\n");
            return 0;
        }
        if (outSize > 0) {
            LOGI("Decoded some stuff\n");
        }
        avPacket.size -= len;
        avPacket.data += len;
    }

    LOGI("Freeing memory\n");
    av_free_packet(&avPacket);
    avcodec_close(avCtx);
    av_free(avCtx);
    return 0;
}
The problem occurs in the first call to avcodec_decode_audio3. I have stepped through the FFmpeg code, but can't find the problem. Any help would be greatly appreciated!
You must set some additional fields on AVCodecContext before you call avcodec_open2. I usually set these required settings (variables beginning with 'k' denote predefined constants):
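The answer's original snippet was not included; below is a minimal sketch of typical encoder-side settings, where the 'k' names are hypothetical constants standing in for the ones the answer refers to (the field names themselves are real AVCodecContext members):

```c
/* Hypothetical constants standing in for the 'k' values mentioned above;
 * use your stream's real parameters. */
enum { kChannels = 2, kSampleRate = 44100, kBitRate = 64000 };

avCtx->channels    = kChannels;          /* number of audio channels   */
avCtx->sample_rate = kSampleRate;        /* samples per second         */
avCtx->bit_rate    = kBitRate;           /* target encoder bit rate    */
avCtx->sample_fmt  = AV_SAMPLE_FMT_S16;  /* 16-bit signed PCM input    */
```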
UPD
I'm sorry, I wrote the parameters that must be set for audio encoding. For audio decoding it is usually sufficient to set avCtx->channels and avCtx->sample_rate, or to set avCtx->extradata and avCtx->extradata_size.
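For instance, a sketch of the two decoder-side variants (not verbatim from the answer; ascBytes and ascSize are hypothetical variables holding the AAC AudioSpecificConfig taken from your container parsing). Either variant must be done before avcodec_open2:

```c
/* Variant 1: tell the decoder the stream layout directly.
 * (Values are illustrative; use your stream's real parameters.) */
avCtx->channels    = 2;
avCtx->sample_rate = 44100;

/* Variant 2: pass the AAC AudioSpecificConfig as extradata.
 * ascBytes/ascSize are hypothetical -- they must come from your
 * demuxer or container parsing. */
avCtx->extradata = av_mallocz(ascSize + FF_INPUT_BUFFER_PADDING_SIZE);
memcpy(avCtx->extradata, ascBytes, ascSize);
avCtx->extradata_size = ascSize;
```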
To find the cause of the error, try looking at the FFmpeg log output. If that is difficult to do on the device, you can redirect FFmpeg's output to your own logging callback and log it yourself. Example:
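The answer's example was not reproduced; here is a sketch of such a callback, assuming Android's logcat is the target (av_log_set_callback and __android_log_print are the real APIs; the tag and buffer size are arbitrary choices):

```c
#include <stdarg.h>       /* va_list   */
#include <stdio.h>        /* vsnprintf */
#include <libavutil/log.h>
#include <android/log.h>

/* Forward every FFmpeg log message to logcat so errors such as
 * "channel element 0.0 is not allocated" appear with full context. */
static void ffmpeg_log_callback(void *ptr, int level, const char *fmt,
        va_list vl) {
    char line[1024];
    vsnprintf(line, sizeof(line), fmt, vl);
    __android_log_print(ANDROID_LOG_INFO, "ffmpeg", "%s", line);
}

/* Install it once, before any decoding work: */
void install_ffmpeg_logging(void) {
    av_log_set_callback(ffmpeg_log_callback);
}
```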