I am trying to implement linear convolution using Core Audio. I have the algorithm implemented and working, but I am struggling to write its output into a .wav audio file. Here is the code for the algorithm...
// Create the array holding the convolution output (size1 + size2 - 1 samples)
int sizeOutput = size1 + size2 - 1;
float *COutput = (float *)malloc(sizeOutput * sizeof(float));

// Convolution algorithm
for (int i = 0; i < sizeOutput; i++) {
    COutput[i] = 0;
    // j runs over CArray2, so bound it by size2 (not CArray1's size)
    for (int j = 0; j < size2; j++) {
        // Accumulate only where both indices are in range;
        // the upper bound on (i - j) prevents reading past CArray1
        if (i - j >= 0 && i - j < size1) {
            COutput[i] += CArray1[i - j] * CArray2[j];
        }
    }
}
I need to write the float values within COutput (a standard array of floats) into an audio file. Am I right in assuming I need to send these float values to an AudioBuffer within an AudioBufferList initially? Or is there a simple way of doing this?
Many thanks for any help or guidance!
This is a late answer, but I struggled to get a float * buffer to write to a file in Swift. Posting this example in case it helps someone.
The free DiracLE time stretching library ( http://dirac.dspdimension.com ) has utility code that converts ABLs (AudioBufferLists) into float arrays and vice versa as part of its example code. Check out its EAFRead and EAFWrite classes; they're exactly what you're looking for.
Yes, I'd put them in an AudioBuffer and that into an AudioBufferList. After that you can write them to a file using ExtAudioFileWrite() on an ExtAudioFileRef that was created using ExtAudioFileCreateNew().
Audio File documentation: http://developer.apple.com/library/mac/#documentation/MusicAudio/Reference/ExtendedAudioFileServicesReference/Reference/reference.html#//apple_ref/doc/uid/TP40007912