Core Audio - Inter-App Audio - How to get output audio packets from a node application inside a host application?

I am writing a host application (HOST) that uses the iOS 7 Inter-App Audio technology from Core Audio to pull audio from a single generator node application (NODE) and process it inside HOST. To do this, I use the Audio Component Services and Audio Unit APIs.

I want to connect to an external node application that can generate sound, have that sound routed into my host application, and have my host application access the audio packets directly as a stream of raw source audio data.

I wrote code inside my HOST application that does the following, in sequence:

  • Establishes and activates an audio session with the appropriate session category.
  • Updates the list of audio-capable applications of type kAudioUnitType_RemoteGenerator or kAudioUnitType_RemoteInstrument (I am not interested in effect applications).
  • Pulls the last object from this list and tries to establish a connection using AudioComponentInstanceNew().
  • Sets the AudioStreamBasicDescription describing the audio format my host application requires.
  • Sets the audio unit's properties and property listeners, as well as a render callback on the output scope (bus).
  • Initializes the audio unit.

Unfortunately, nothing happens after that. How do I actually get the audio data from the node? Do I need to call AudioUnitRender(), and if so, from where, so that it pulls audio from the node? My render callback is never invoked, so calling AudioUnitRender() inside it does not help. Where and how should AudioUnitRender() be called to get audio from the "node"?

Here is the code from my HOST application.

static OSStatus MyAURenderCallback (void                        *inRefCon,
                                    AudioUnitRenderActionFlags  *ioActionFlags,
                                    const AudioTimeStamp        *inTimeStamp,
                                    UInt32                      inBusNumber,
                                    UInt32                      inNumberFrames,
                                    AudioBufferList             *ioData) 
{
     //Do something here with the audio data?
     //This callback is never called. Why?
     //Do I need to put AudioUnitRender() in here?
     return noErr;
}
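Not part of the original question, but for context: the callback installed with kAudioUnitProperty_SetRenderCallback supplies audio *to* a unit, while a generator node produces its own audio, so the host normally has to pull frames itself with AudioUnitRender(), typically from inside the render cycle of its own I/O unit. A minimal sketch of that pattern, assuming `nodeUnit` is the already-connected remote audio unit (the name is hypothetical):

```objectivec
//Hedged sketch (not from the original post): pull audio from the node by
//calling AudioUnitRender() from the render callback of the HOST's own
//RemoteIO unit. `nodeUnit` is assumed to be the connected remote unit.
static OSStatus HostIOCallback (void                        *inRefCon,
                                AudioUnitRenderActionFlags  *ioActionFlags,
                                const AudioTimeStamp        *inTimeStamp,
                                UInt32                      inBusNumber,
                                UInt32                      inNumberFrames,
                                AudioBufferList             *ioData)
{
    AudioUnit nodeUnit = *(AudioUnit *)inRefCon;

    //Pull inNumberFrames frames from output bus 0 of the node into ioData;
    //the host can then inspect or copy the buffers before they are played.
    return AudioUnitRender(nodeUnit, ioActionFlags, inTimeStamp,
                           0, inNumberFrames, ioData);
}
```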

    - (void)start
    {
        [self configureAudioSession];
        [self refreshAUList];
    }

    - (void)configureAudioSession
    {
        NSError *audioSessionError = nil;
        AVAudioSession *mySession = [AVAudioSession sharedInstance];
        [mySession setPreferredSampleRate: _graphSampleRate error: &audioSessionError];
        [mySession setCategory: AVAudioSessionCategoryPlayAndRecord error: &audioSessionError];
        [mySession setActive: YES error: &audioSessionError];
        self.graphSampleRate = [mySession sampleRate];
    }

    - (void)refreshAUList
    {
        _audioUnits = @[].mutableCopy;

        AudioComponentDescription searchDesc = { 0, 0, 0, 0, 0 }, foundDesc;
        AudioComponent comp = NULL;

        while (true) {

            comp = AudioComponentFindNext(comp, &searchDesc);

            if (comp == NULL) break;

            if (AudioComponentGetDescription(comp, &foundDesc) != noErr) continue;

            if (foundDesc.componentType == kAudioUnitType_RemoteGenerator ||
                foundDesc.componentType == kAudioUnitType_RemoteInstrument) {

                RemoteAU *rau = [[RemoteAU alloc] init];
                rau->_desc = foundDesc;
                rau->_comp = comp;

                AudioComponentCopyName(comp, &rau->_name);
                rau->_image = AudioComponentGetIcon(comp, 48);
                rau->_lastActiveTime = AudioComponentGetLastActiveTime(comp);

                [_audioUnits addObject:rau];
            }
        }

        [self connect];
    }

- (void)connect  {

    if ([_audioUnits count] <= 0) {
        return;
    }

    RemoteAU *rau = [_audioUnits lastObject];

    AudioUnit myAudioUnit;

    //Node application will get launched in background
    Check(AudioComponentInstanceNew(rau->_comp, &myAudioUnit));

    AudioStreamBasicDescription format = {0};
    format.mChannelsPerFrame  = 2;
    format.mSampleRate = [[AVAudioSession sharedInstance] sampleRate];
    format.mFormatID = kAudioFormatMPEG4AAC;
    UInt32 propSize = sizeof(format);
    Check(AudioFormatGetProperty(kAudioFormatProperty_FormatInfo, 0, NULL, &propSize, &format));

    //Output format from node to host
    Check(AudioUnitSetProperty(myAudioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 0, &format, sizeof(format)));

    //Setup a render callback to the output scope of the audio unit representing the node app
    AURenderCallbackStruct callbackStruct = {0};
    callbackStruct.inputProc = MyAURenderCallback;
    callbackStruct.inputProcRefCon = (__bridge void *)(self);
    Check(AudioUnitSetProperty(myAudioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Output, 0, &callbackStruct, sizeof(callbackStruct)));

    //setup call backs
    Check(AudioUnitAddPropertyListener(myAudioUnit, kAudioUnitProperty_IsInterAppConnected, IsInterappConnected, NULL));
    Check(AudioUnitAddPropertyListener(myAudioUnit, kAudioOutputUnitProperty_HostTransportState, AudioUnitPropertyChangeDispatcher, NULL));

    //Initialize the audio unit representing the node application
    Check(AudioUnitInitialize(myAudioUnit));
}
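One more note that is not from the original post: AudioUnitRender() and render callbacks operate on linear PCM, so a compressed format ID such as kAudioFormatMPEG4AAC is generally rejected as an inter-unit stream format. A canonical LPCM description for the node-to-host connection might look like this (the exact flags and field values are an assumption, not the original author's code):

```objectivec
//Hedged sketch: a canonical non-interleaved Float32 LPCM format, the kind
//of description AudioUnitRender() expects for the node-to-host stream.
AudioStreamBasicDescription format = {0};
format.mSampleRate       = [[AVAudioSession sharedInstance] sampleRate];
format.mFormatID         = kAudioFormatLinearPCM;
format.mFormatFlags      = kAudioFormatFlagsNativeFloatPacked |
                           kAudioFormatFlagIsNonInterleaved;
format.mChannelsPerFrame = 2;
format.mBitsPerChannel   = 32;
format.mBytesPerFrame    = sizeof(Float32); //non-interleaved: one channel per buffer
format.mBytesPerPacket   = sizeof(Float32);
format.mFramesPerPacket  = 1;
```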

Source: https://habr.com/ru/post/1540340/

