What do you mean by "high precision"?
Although the documentation says that AVAssetReader is not intended for real-time use, in practice I have had no problems reading video in real time with it (cf fooobar.com/questions/123654 / ...). The frames it returns carry a "presentation timestamp," which you can get with CMSampleBufferGetPresentationTimeStamp.
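As a rough sketch (the helper names here are mine, not from any framework), setting up a reader for the video track and pulling the timestamp off each sample buffer looks something like this:

```swift
import AVFoundation

// Illustrative setup: create an AVAssetReader for the first video track
// and an output that vends decoded BGRA frames.
func makeVideoReader(for asset: AVAsset) throws -> (AVAssetReader, AVAssetReaderTrackOutput)? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let reader = try AVAssetReader(asset: asset)
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    guard reader.startReading() else { return nil }
    return (reader, output)
}

// Each sample buffer carries a presentation timestamp you can compare
// against whatever clock you treat as the master.
func nextFrameTime(from output: AVAssetReaderTrackOutput) -> CMTime? {
    guard let buffer = output.copyNextSampleBuffer() else { return nil }
    return CMSampleBufferGetPresentationTimeStamp(buffer)
}
```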
You want one part of the project to be the master timekeeper here. Assuming your CALayer animation is cheap to compute and doesn't involve anything potentially blocking like disk access, I would use it as the main source of time. When you need to draw content (for example, in the draw method of a UIView subclass), read currentTime from the CALayer animation, then keep pulling video frames from the AVAssetReader with copyNextSampleBuffer until CMSampleBufferGetPresentationTimeStamp returns >= currentTime, draw that frame, and then draw the CALayer animation's content on top. A sketch of that loop is below.
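This is only a sketch of the idea under those assumptions; the property names (`output`, `overlayLayer`) and the way I derive the layer's clock are illustrative, not part of any particular API:

```swift
import AVFoundation
import CoreImage
import UIKit

// Sketch of the draw loop: the animation layer's timeline is the master
// clock; video frames are advanced until their presentation timestamp
// catches up, then the layer's current contents are rendered on top.
final class VideoCompositeView: UIView {
    var output: AVAssetReaderTrackOutput?      // from a setup like the one above
    var overlayLayer: CALayer?                  // hosts the CALayer animation
    private var currentFrame: CMSampleBuffer?

    override func draw(_ rect: CGRect) {
        guard let output = output,
              let overlayLayer = overlayLayer,
              let context = UIGraphicsGetCurrentContext() else { return }

        // Treat the animation's timeline as the master clock.
        let seconds = overlayLayer.convertTime(CACurrentMediaTime(), from: nil)
        let currentTime = CMTime(seconds: seconds, preferredTimescale: 600)

        // Advance through video frames until the presentation timestamp
        // is at or past the animation clock.
        while currentFrame == nil ||
              CMSampleBufferGetPresentationTimeStamp(currentFrame!) < currentTime {
            guard let next = output.copyNextSampleBuffer() else { break }
            currentFrame = next
        }

        // Draw the video frame first (note: CGContext.draw uses a flipped
        // coordinate system relative to UIKit; a real implementation would
        // correct for that).
        if let frame = currentFrame,
           let pixelBuffer = CMSampleBufferGetImageBuffer(frame) {
            let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
            if let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) {
                context.draw(cgImage, in: rect)
            }
        }

        // ...then render the animation layer's current state on top.
        overlayLayer.presentation()?.render(in: context)
    }
}
```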