I'm not sure whether Audio Units can work as codecs in a streaming audio scenario on the iPhone.
I've read in various places that this can be done, but I haven't seen any examples or proper documentation for it. Instead, I find that most of the released apps have utilised FFmpeg and libmms.
I appreciate any help you can give me.
Audio Units are very low-level and are useful if you want to do some heavy audio processing, like realtime audio effects. As far as I know, Audio Units don't support the WMA codec, so you'll have to use the FFmpeg library for that.
Since FFmpeg also supports MMS, it isn't necessary to use another library like libmms to connect to MMS audio streams. You can connect to an MMS audio stream with FFmpeg like this:
AVFormatContext *formatCtx = NULL; // avformat_open_input expects &ctx, with ctx NULL or preallocated
const char *url = "mmst://somemmsurlhere.com";
avformat_open_input(&formatCtx, url, NULL, NULL); // returns 0 on success
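Before you can decode anything you also need to locate the audio stream inside the container and open a decoder for it. Here is a minimal sketch, assuming the same FFmpeg era as avcodec_decode_audio3 and that av_register_all() and avformat_network_init() were called at startup; the helper name open_audio_decoder is my own:

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

// Hypothetical helper: find the audio stream and open its decoder.
static AVCodecContext *open_audio_decoder(AVFormatContext *formatCtx, int *audioStreamIndex) {
    if (avformat_find_stream_info(formatCtx, NULL) < 0)
        return NULL;                                   // couldn't read stream info
    *audioStreamIndex = av_find_best_stream(formatCtx, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);
    if (*audioStreamIndex < 0)
        return NULL;                                   // no audio stream found
    AVCodecContext *codecCtx = formatCtx->streams[*audioStreamIndex]->codec;
    AVCodec *codec = avcodec_find_decoder(codecCtx->codec_id); // e.g. the WMA decoder
    if (!codec || avcodec_open2(codecCtx, codec, NULL) < 0)
        return NULL;                                   // decoder missing or failed to open
    return codecCtx;
}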
For decoding the audio data you can use the avcodec_decode_audio3 function. Once you have the decoded audio data ready, I suggest using Audio Queue Services (in the AudioToolbox framework) for playback. An audio queue works by invoking callback functions you've defined to ask you for audio data. One of these callbacks is the AudioQueueOutputCallback, in which you can pass the decoded audio data along like this:
- (void)handlePlayCallback:(AudioQueueRef)inAudioQueue buffer:(AudioQueueBufferRef)inBuffer {
    // Copy decoded audio data to inBuffer->mAudioData and record how much
    // was written: inBuffer->mAudioDataByteSize = data_written;
    if (inBuffer->mAudioDataByteSize > 0) {
        AudioQueueEnqueueBuffer(inAudioQueue, inBuffer, 0, NULL);
    }
}
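Note that the method above is normally reached through a plain C trampoline, because AudioQueueNewOutput takes a C function pointer rather than an Objective-C selector. A sketch, where StreamPlayer is a hypothetical class owning the queue:

// C trampoline registered with the audio queue; forwards to the method above.
static void playCallback(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer) {
    StreamPlayer *player = (StreamPlayer *)inUserData;  // hypothetical owner class
    [player handlePlayCallback:inAQ buffer:inBuffer];
}

// Creating the queue; audioFormat describes the decoded PCM:
AudioQueueNewOutput(&audioFormat, playCallback, self, NULL, NULL, 0, &audioQueue);

The decode side then reads packets from the MMS stream and runs them through avcodec_decode_audio3. This is a sketch under the same assumptions as above; enqueue_pcm is a hypothetical function that hands the PCM to whatever buffer your play callback drains:

#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

void enqueue_pcm(const int16_t *pcm, int bytes); // hypothetical hand-off to the playback side

void decode_loop(AVFormatContext *formatCtx, AVCodecContext *codecCtx, int audioStreamIndex) {
    AVPacket packet;
    // Static so the large decode buffer doesn't live on the stack.
    static int16_t samples[AVCODEC_MAX_AUDIO_FRAME_SIZE / sizeof(int16_t)];

    while (av_read_frame(formatCtx, &packet) >= 0) {
        if (packet.stream_index == audioStreamIndex) {
            int outSize = sizeof(samples);  // in: buffer capacity, out: bytes decoded
            int used = avcodec_decode_audio3(codecCtx, samples, &outSize, &packet);
            // A robust loop would keep decoding until the whole packet is consumed.
            if (used > 0 && outSize > 0)
                enqueue_pcm(samples, outSize);
        }
        av_free_packet(&packet);
    }
}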
Having looked at this again, it appears that iOS only allows the built-in audio units, unlike Mac OS X. The built-in units are described in the "Audio Unit Hosting Guide for iOS" under "Using Specific Audio Units", which can be found in Apple's online developer documentation.