AUGraph output unit callback function - cannot read data

I am trying to capture the audio stream arriving at the output unit of an AUGraph so that I can write it to a file. I registered a render callback on the output unit (default output) after creating the graph, as follows:

    AudioComponent comp = AudioComponentFindNext(NULL, &cd);
    if (comp == NULL) {
        printf("can't get output unit");
        exit(-1);
    }
    CheckError(AudioComponentInstanceNew(comp, &player->outputUnit),
               "Couldn't open component for outputUnit");   // outputUnit is of type AudioUnit

    // register render callback
    AURenderCallbackStruct input;
    input.inputProc = MyRenderProc;
    input.inputProcRefCon = player;
    CheckError(AudioUnitSetProperty(player->outputUnit,
                                    kAudioUnitProperty_SetRenderCallback,
                                    kAudioUnitScope_Input,
                                    0,
                                    &input,
                                    sizeof(input)),
               "AudioUnitSetProperty failed");

    // initialize unit
    CheckError(AudioUnitInitialize(player->outputUnit),
               "Couldn't initialize output unit");

The callback function is called, but when I try to read the incoming stream from ioData->mBuffers[0].mData, all I get is zeros. Here is the callback function:

    OSStatus MyRenderProc(void *inRefCon,   // PLAYER CODE
                          AudioUnitRenderActionFlags *ioActionFlags,
                          const AudioTimeStamp *inTimeStamp,
                          UInt32 inBusNumber,
                          UInt32 inNumberFrames,
                          AudioBufferList *ioData)
    {
        int frame = 0;
        Float32 leftFloat = 0;
        Float32 rightFloat = 0;
        NSNumber *leftNumber;
        NSNumber *rightNumber;

        for (frame = 0; frame < inNumberFrames; ++frame) {
            Float32 *data = (Float32 *)ioData->mBuffers[0].mData;
            leftFloat = (data)[frame];
            leftNumber = [NSNumber numberWithDouble:leftFloat];
            (data)[frame] = 0;

            // copy to right channel too
            data = (Float32 *)ioData->mBuffers[1].mData;
            rightFloat = (data)[frame];
            rightNumber = [NSNumber numberWithDouble:(data)[frame]];
            (data)[frame] = 0;

            [leftData addObject:leftNumber];
            [rightData addObject:rightNumber];
        }
        return noErr;
    }

Also, if I do not zero out the data, I hear noise during playback, which tells me that I am misinterpreting what mBuffers holds. What am I doing wrong here?

2 answers

If capturing the AUGraph's output is the task, the critical part of the code comes down (more or less) to this simplest single-channel demo:

    OSStatus MyRenderProc(void *inRefCon,
                          AudioUnitRenderActionFlags *ioActionFlags,
                          const AudioTimeStamp *inTimeStamp,
                          UInt32 inBusNumber,
                          UInt32 inNumberFrames,
                          AudioBufferList *ioData)
    {
        Float32 buf[inNumberFrames];                      // just for one channel!
        MyMIDIPlayer *player = (MyMIDIPlayer *)inRefCon;

        if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
            static int TEMP_kAudioUnitRenderAction_PostRenderError = (1 << 8);
            if (!(*ioActionFlags & TEMP_kAudioUnitRenderAction_PostRenderError)) {
                Float32 *data = (Float32 *)ioData->mBuffers[0].mData;   // just for one channel!
                memcpy(buf, data, inNumberFrames * sizeof(Float32));
                // do something with buf - there are nice examples of ExtAudioFileWriteAsync()
            }
        }
        return noErr;
    }

In setupAUGraph() this callback can be configured as follows:

    void setupAUGraph(MyMIDIPlayer *player)
    {
        // the beginning follows the textbook example setup pattern
        {… … …}

        // this is the specific part: register MyRenderProc as a render
        // notification on the instrument unit; the refCon must match the
        // cast inside the callback, i.e. a MyMIDIPlayer pointer
        CheckError(AudioUnitAddRenderNotify(player->instrumentUnit,
                                            MyRenderProc,
                                            player),
                   "AudioUnitAddRenderNotify failed");

        // now initialize the graph (causes resources to be allocated)
        CheckError(AUGraphInitialize(player->graph),
                   "AUGraphInitialize failed");
    }

Note that the render callback "taps" the connection between the instrument node's output and the output node's input, capturing whatever comes from upstream. The callback simply copies ioData into another buffer which can then be saved. AFAIK this is the simplest way of accessing ioData that I know of which works, without breaking the API.
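
For the "do something with buf" step, a minimal sketch of handing the captured frames to ExtAudioFileWriteAsync() could look like the following. The extAudioFile field on MyMIDIPlayer is an assumption, not part of the original struct - it would be created during setup with ExtAudioFileCreateWithURL() and primed once on a non-real-time thread with ExtAudioFileWriteAsync(extAudioFile, 0, NULL):

    #include <AudioToolbox/AudioToolbox.h>

    OSStatus MyCaptureProc(void *inRefCon,
                           AudioUnitRenderActionFlags *ioActionFlags,
                           const AudioTimeStamp *inTimeStamp,
                           UInt32 inBusNumber,
                           UInt32 inNumberFrames,
                           AudioBufferList *ioData)
    {
        // assumed: MyMIDIPlayer extended with an ExtAudioFileRef extAudioFile field
        MyMIDIPlayer *player = (MyMIDIPlayer *)inRefCon;

        // only the post-render pass carries the freshly rendered samples
        if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
            // ExtAudioFileWriteAsync() copies ioData into its own internal
            // buffer and writes to disk on a background thread, so it is
            // safe to call from the render thread once it has been primed
            ExtAudioFileWriteAsync(player->extAudioFile, inNumberFrames, ioData);
        }
        return noErr;
    }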

Also note the deliberately plain C-style code inside the callback - there is no need for Objective-C methods there. Messing with NSArrays, adding objects, and the like inside a real-time callback introduces the risk of priority problems which can later become very hard to debug. The CoreAudio API is written in plain C for a reason: much of what the Obj-C runtime does under the hood (locks, memory management, etc.) cannot happen on the real-time thread without risking glitches. So it is safer to keep Obj-C out of the real-time thread altogether.
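
As a rough sketch of what the same capture looks like with no Objective-C at all, the callback below copies into a preallocated plain-C buffer that the main thread drains later. The MyCaptureState struct and its fields are illustrative only (not part of MyMIDIPlayer); a pointer to it would be passed as the refCon to AudioUnitAddRenderNotify():

    #include <AudioToolbox/AudioToolbox.h>
    #include <string.h>

    #define kCaptureCapacityFrames (44100 * 60)   // one minute at 44.1 kHz, mono

    typedef struct MyCaptureState {
        Float32        *captureBuffer;   // malloc'd during setup, kCaptureCapacityFrames long
        volatile UInt32 captureFrames;   // frames written so far, read by the main thread
    } MyCaptureState;

    OSStatus MyPlainCRenderNotify(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData)
    {
        MyCaptureState *state = (MyCaptureState *)inRefCon;

        if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
            UInt32 written = state->captureFrames;
            UInt32 room    = kCaptureCapacityFrames - written;
            UInt32 todo    = (inNumberFrames < room) ? inNumberFrames : room;

            // a plain memcpy only: no locks, no allocation, no Obj-C messaging
            memcpy(state->captureBuffer + written,
                   ioData->mBuffers[0].mData,
                   todo * sizeof(Float32));
            state->captureFrames = written + todo;
        }
        return noErr;
    }
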
Hope this helps.


The Audio Unit output render callback is not meant for capturing output. It is supposed to provide the output (fill mBuffers with samples) for the unit to play.
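
For contrast, here is a minimal sketch of what a callback registered with kAudioUnitProperty_SetRenderCallback is actually expected to do: fill ioData with samples for the unit to play. It assumes the question's stream format (non-interleaved stereo Float32) and a 44.1 kHz sample rate:

    #include <AudioToolbox/AudioToolbox.h>
    #include <math.h>

    OSStatus MySineRenderProc(void *inRefCon,
                              AudioUnitRenderActionFlags *ioActionFlags,
                              const AudioTimeStamp *inTimeStamp,
                              UInt32 inBusNumber,
                              UInt32 inNumberFrames,
                              AudioBufferList *ioData)
    {
        static double phase = 0.0;                           // illustration only
        const double sampleRate = 44100.0;                   // assumed stream format
        const double increment  = 2.0 * M_PI * 440.0 / sampleRate;

        Float32 *left  = (Float32 *)ioData->mBuffers[0].mData;
        Float32 *right = (Float32 *)ioData->mBuffers[1].mData;

        for (UInt32 frame = 0; frame < inNumberFrames; ++frame) {
            Float32 sample = (Float32)sin(phase);
            left[frame]  = sample;   // the callback *writes* output here...
            right[frame] = sample;   // ...it does not read captured input
            phase += increment;
            if (phase > 2.0 * M_PI) phase -= 2.0 * M_PI;
        }
        return noErr;
    }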


Source: https://habr.com/ru/post/917298/

