Suggestions for synchronizing video with audio using Audio Queue Services?

I am decoding a video format whose soundtrack ships in a separate file. According to the specification, a video frame should be displayed every 1/75th of a second, and the video runs for the same length of time as the audio track.

I play the audio with Audio Queue Services (which I chose because I figured there would be situations where I need precise timing control, and this is exactly such a situation!). It is a great API, and I haven't strayed far from the sample code in Apple's Programming Guide (although I wrapped it all in a more convenient ObjC API).
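To give an idea of the setup, here is a stripped-down sketch of the standard output-queue pattern from the guide; the PlayerState struct, the three-buffer count, and the ReadPackets() decoder hook are simplified placeholders rather than my actual code:

    #include <AudioToolbox/AudioToolbox.h>
    #include <stdbool.h>

    #define kNumBuffers 3

    typedef struct {
        AudioStreamBasicDescription format;   // PCM format of the decoded soundtrack
        AudioQueueRef               queue;
        AudioQueueBufferRef         buffers[kNumBuffers];
        bool                        done;
    } PlayerState;

    // Placeholder decoder hook: fills buf->mAudioData and returns the byte count.
    extern UInt32 ReadPackets(PlayerState *p, AudioQueueBufferRef buf);

    static void OutputCallback(void *userData, AudioQueueRef aq, AudioQueueBufferRef buf) {
        PlayerState *p = (PlayerState *)userData;
        UInt32 bytes = ReadPackets(p, buf);
        if (bytes > 0) {
            buf->mAudioDataByteSize = bytes;
            AudioQueueEnqueueBuffer(aq, buf, 0, NULL);   // hand the refilled buffer back
        } else {
            p->done = true;
            AudioQueueStop(aq, false);                   // false = let queued audio drain
        }
    }

    void StartPlayback(PlayerState *p) {
        AudioQueueNewOutput(&p->format, OutputCallback, p,
                            CFRunLoopGetCurrent(), kCFRunLoopCommonModes, 0, &p->queue);
        for (int i = 0; i < kNumBuffers; i++) {
            AudioQueueAllocateBuffer(p->queue, 0x8000, &p->buffers[i]);
            OutputCallback(p, p->queue, p->buffers[i]);  // prime each buffer before starting
        }
        AudioQueueStart(p->queue, NULL);                 // NULL start time = play as soon as possible
    }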

Under ideal conditions, basic playback works great: video and audio stay in sync and finish at the same time (as far as I can tell). However, if performance hiccups (or if I attach the Leaks instrument, or anything like that), the two quickly drift out of sync.

This is the first time I have written anything like this; I have no prior experience with audio or video code, and certainly none with Audio Queue Services, so I'm not sure where to go from here.

Have you done something like this before? Do you have any advice, tips, or tricks? Is there any fundamental documentation I should read? Any help would be greatly appreciated.


A disclaimer first: my experience is mostly with the HAL through AUHAL rather than with AudioQueue, but AQ sits on top of the same machinery, so the equivalent facilities should be there.

The thing to keep in mind is that the audio hardware is the only clock you can really trust. Timers, the run loop, and anything else the host drives will wander, especially when the machine is under load, so don't try to pace the audio to the video; pace the video to the audio.

Concretely, ask the queue how much audio it has actually played since its startTime, convert that to seconds, and from your 75 frames-per-second rate work out which frame should be on screen right now, then show that one. If playback hiccups, the audio clock stalls along with it, so you drop a few video frames instead of letting the two streams drift apart.
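Here is a rough sketch of that, assuming an LPCM queue whose sample rate you know up front; AudioQueueGetCurrentTime() is the call that matters, and DisplayFrame() is just a placeholder for whatever your video side does:

    #include <AudioToolbox/AudioToolbox.h>

    extern void DisplayFrame(long frameIndex);   // placeholder: show frame N of the video

    // Call this from whatever drives your display (timer, CVDisplayLink, run-loop tick).
    void ShowFrameForCurrentAudioTime(AudioQueueRef queue, Float64 sampleRate) {
        AudioTimeStamp ts;
        Boolean discontinuity = false;
        OSStatus err = AudioQueueGetCurrentTime(queue, NULL, &ts, &discontinuity);
        if (err != noErr || !(ts.mFlags & kAudioTimeStampSampleTimeValid))
            return;                               // queue not started yet, or already stopped

        // mSampleTime counts sample frames played since the queue started, so it is
        // the true playback position no matter what hiccups happened along the way.
        Float64 elapsedSeconds = ts.mSampleTime / sampleRate;

        // One video frame per 1/75 s means the frame that belongs on screen right now
        // is elapsed time * 75. Frames missed during a stall are skipped, not shown late.
        long frameIndex = (long)(elapsedSeconds * 75.0);
        DisplayFrame(frameIndex);
    }

The important part is that nothing here counts frames on its own; every display decision is re-derived from the queue's clock, so errors never get a chance to accumulate.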

