Well, I ended up solving my problem differently. The animation route did not work, so my solution was to compile all of my inserted images into a temporary video file and use that video to insert the images into my final video.
Starting from the first link I originally posted, ASSETWriterInput to create a video with UIImages on Iphone issues, I created the following function to build my temporary video:
void CreateFrameImageVideo(NSString* path) {
    NSLog(@"Creating writer at path %@", path);
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                            fileType:AVFileTypeMPEG4
                                                               error:&error];

    // Reuse the source track's bit rate and frame rate so the temporary video
    // matches the video it will be spliced into.
    NSLog(@"Creating video codec settings");
    NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:gVideoTrack.estimatedDataRate], AVVideoAverageBitRateKey,
                                   [NSNumber numberWithInt:gVideoTrack.nominalFrameRate], AVVideoMaxKeyFrameIntervalKey,
                                   AVVideoProfileLevelH264MainAutoLevel, AVVideoProfileLevelKey,
                                   nil];

    NSLog(@"Creating video settings");
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   codecSettings, AVVideoCompressionPropertiesKey,
                                   [NSNumber numberWithInt:1280], AVVideoWidthKey,
                                   [NSNumber numberWithInt:720], AVVideoHeightKey,
                                   nil];

    NSLog(@"Creating writer input");
    AVAssetWriterInput *writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                          outputSettings:videoSettings] retain];

    // The adaptor is what the frame images are appended through as pixel buffers.
    NSLog(@"Creating adaptor");
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                          sourcePixelBufferAttributes:nil];

    [videoWriter addInput:writerInput];

    NSLog(@"Starting session");
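The snippet above is cut off after the "Starting session" log. The rest of the function presumably started the writing session, appended each frame image through the adaptor, and finished the file, roughly along these lines; note that gFrameCount, gLastFrameEndTime, the image field on gFrames, and the pixelBufferFromCGImage helper are assumptions for illustration, not the original code:

    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    for (int i = 0; i < gFrameCount; i++) {
        // Wait until the writer input can accept more data.
        while (!writerInput.readyForMoreMediaData) {
            [NSThread sleepForTimeInterval:0.05];
        }

        // Convert the UIImage into a CVPixelBufferRef (helper assumed; the usual
        // approach creates a CVPixelBuffer and draws the CGImage into it with Core Graphics).
        CVPixelBufferRef buffer = pixelBufferFromCGImage([gFrames[i].image CGImage], 1280, 720);

        // Present each image at its start time within the temporary video.
        [adaptor appendPixelBuffer:buffer withPresentationTime:gFrames[i].startTime];
        CVPixelBufferRelease(buffer);
    }

    [writerInput markAsFinished];
    [videoWriter endSessionAtSourceTime:gLastFrameEndTime];
    [videoWriter finishWriting]; // deprecated since iOS 6 in favour of finishWritingWithCompletionHandler:
    [videoWriter release];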
After the video is created, it is loaded as an AVAsset, its video track is extracted, and the video is inserted into the composition by replacing the following line (from the first code block in the original post):
[mutableCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(CMTimeAdd(gFrames[i].startTime, timeOffset), gFrames[i].duration)];
with:
[mutableCompositionTrack insertTimeRange:CMTimeRangeMake(timeOffset,gAnalysisFrames[i].duration) ofTrack:gFramesTrack atTime:CMTimeAdd(gAnalysisFrames[i].startTime, timeOffset) error:&gError];
where gFramesTrack is an AVAssetTrack created from the temporary frame video; a sketch of how that loading step might look is shown below.
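In outline, loading the temporary video back and splicing its track into the composition would look roughly like this. The gFramesTrack, mutableCompositionTrack, gAnalysisFrames, timeOffset, and gError names match the snippets above; framesVideoPath and the asset-loading lines are assumptions about how the track was obtained:

    // Load the temporary frame video that CreateFrameImageVideo() wrote to disk.
    AVURLAsset *framesAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:framesVideoPath] options:nil];
    AVAssetTrack *gFramesTrack = [[framesAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    // Splice the frame video's track into the composition at the point where the
    // images should appear, instead of inserting an empty time range.
    [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(timeOffset, gAnalysisFrames[i].duration)
                                     ofTrack:gFramesTrack
                                      atTime:CMTimeAdd(gAnalysisFrames[i].startTime, timeOffset)
                                       error:&gError];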
All of the code related to the CALayer and CABasicAnimation objects was removed, since it simply did not work.
Not the smartest solution, I don't think, but at least it works. I hope someone finds this helpful.
This code also works on iOS devices (tested on an iPad 3).
Note: the DebugLog function from the first post is just a callback to a function that prints log messages; if necessary, those calls can be replaced with NSLog().
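If it helps, a drop-in stand-in could be as simple as a macro that forwards to NSLog(); this is only a guess at what DebugLog wraps:

    // Hypothetical replacement for the DebugLog callback mentioned above.
    #define DebugLog(fmt, ...) NSLog((fmt), ##__VA_ARGS__)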