MIKMIDI does not include the ability to do this out of the box, but it is possible. Essentially, you need to create your own Core Audio AUGraph that includes a MIDI synth audio unit. Then, in a render callback for the graph, you pull events from MIKMIDISequence based on timestamps calculated from the AudioTimeStamp and the number of frames passed to the render callback. You deliver them to the synth with MusicDeviceMIDIEvent().
You also need a way to grab the synthesized output. You can do this by adding a generic output unit to the AU graph, adding a render callback to it, and then, in that callback, grabbing the AudioBufferList and writing it to a file (e.g. using ExtAudioFileWrite()).