Mixing Images and Videos with AVFoundation

I am trying to merge images into an existing video to create a new video using AVFoundation on Mac.

So far I have read the Apple documentation example and looked at the following:

ASSETWriterInput to create video with UIImages on iPhone issues

Mix video with still image in CALayer using AVVideoCompositionCoreAnimationTool

AVFoundation Tutorial: Add Labels and Animations to Videos, and several other SO links

These have been useful here and there, but my problem is that I am not creating a static watermark or overlay; I want to put images in between sections of the video. So far I have managed to take the video, create the empty sections where these images should be inserted, and export it.

What I am stuck on is getting the images to show up in those empty sections. The only way I can see to do it is to create a series of layers that are animated to change their opacity at the correct times, but I can't get the animation to work.

Below is the code that I use to create video segments and animate layers.

 //https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_Editing.html#//apple_ref/doc/uid/TP40010188-CH8-SW7

 // let's start by making our video composition
 AVMutableComposition* mutableComposition = [AVMutableComposition composition];
 AVMutableCompositionTrack* mutableCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];

 AVMutableVideoComposition* mutableVideoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:gVideoAsset];

 // if the first point frame doesn't start at 0
 if (gFrames[0].startTime.value != 0)
 {
     DebugLog("Inserting vid at 0");
     // then add the video track to the composition track with a time range from 0 to the first point's startTime
     [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, gFrames[0].startTime)
                                      ofTrack:gVideoTrack
                                       atTime:kCMTimeZero
                                        error:&gError];
 }

 if (gError)
 {
     DebugLog("Error inserting original video segment");
     GetError();
 }

 // create our parent layer and video layer
 CALayer* parentLayer = [CALayer layer];
 CALayer* videoLayer = [CALayer layer];
 parentLayer.frame = CGRectMake(0, 0, 1280, 720);
 videoLayer.frame = CGRectMake(0, 0, 1280, 720);
 [parentLayer addSublayer:videoLayer];

 // create an offset value that should be added to each point where a new video segment should go
 CMTime timeOffset = CMTimeMake(0, 600);

 // loop through each additional frame
 for (int i = 0; i < gFrames.size(); i++)
 {
     // create an animation layer and assign its contents to the CGImage of the frame
     CALayer* Frame = [CALayer layer];
     Frame.contents = (__bridge id)gFrames[i].frameImage;
     Frame.frame = CGRectMake(0, 720, 1280, -720);

     DebugLog("inserting empty time range");
     // add the frame point to the composition track starting at the point's start time:
     // insert an empty time range for the duration of the frame animation
     [mutableCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(CMTimeAdd(gFrames[i].startTime, timeOffset), gFrames[i].duration)];

     // update the time offset by the duration
     timeOffset = CMTimeAdd(timeOffset, gFrames[i].duration);

     // make the layer completely transparent
     Frame.opacity = 0.0f;

     // create an animation for setting opacity to 0 on start
     CABasicAnimation* frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
     frameAnim.duration = 1.0f;
     frameAnim.repeatCount = 0;
     frameAnim.autoreverses = NO;
     frameAnim.fromValue = [NSNumber numberWithFloat:0.0];
     frameAnim.toValue = [NSNumber numberWithFloat:0.0];
     frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero;
     frameAnim.speed = 1.0f;
     [Frame addAnimation:frameAnim forKey:@"animateOpacity"];

     // create an animation for setting opacity to 1
     frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
     frameAnim.duration = 1.0f;
     frameAnim.repeatCount = 0;
     frameAnim.autoreverses = NO;
     frameAnim.fromValue = [NSNumber numberWithFloat:1.0];
     frameAnim.toValue = [NSNumber numberWithFloat:1.0];
     frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero + CMTimeGetSeconds(gFrames[i].startTime);
     frameAnim.speed = 1.0f;
     [Frame addAnimation:frameAnim forKey:@"animateOpacity"];

     // create an animation for setting opacity to 0
     frameAnim = [CABasicAnimation animationWithKeyPath:@"opacity"];
     frameAnim.duration = 1.0f;
     frameAnim.repeatCount = 0;
     frameAnim.autoreverses = NO;
     frameAnim.fromValue = [NSNumber numberWithFloat:0.0];
     frameAnim.toValue = [NSNumber numberWithFloat:0.0];
     frameAnim.beginTime = AVCoreAnimationBeginTimeAtZero + CMTimeGetSeconds(gFrames[i].endTime);
     frameAnim.speed = 1.0f;
     [Frame addAnimation:frameAnim forKey:@"animateOpacity"];

     // add the frame layer to our parent layer
     [parentLayer addSublayer:Frame];

     gError = nil;

     // if there is another point after this one
     if (i < gFrames.size() - 1)
     {
         // add our video file to the composition with a range from this point's end to the next point's start
         [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(gFrames[i].startTime,
                                                                  CMTimeMake(gFrames[i+1].startTime.value - gFrames[i].startTime.value, 600))
                                          ofTrack:gVideoTrack
                                           atTime:CMTimeAdd(gFrames[i].startTime, timeOffset)
                                            error:&gError];
     }
     // else just add our video file with a range from this point's end to the video's duration
     else
     {
         [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(gFrames[i].startTime,
                                                                  CMTimeSubtract(gVideoAsset.duration, gFrames[i].startTime))
                                          ofTrack:gVideoTrack
                                           atTime:CMTimeAdd(gFrames[i].startTime, timeOffset)
                                            error:&gError];
     }

     if (gError)
     {
         char errorMsg[256];
         sprintf(errorMsg, "Error inserting original video segment at: %d", i);
         DebugLog(errorMsg);
         GetError();
     }
 }

In this snippet the frame layer's opacity is set to 0.0f; when I set it to 1.0f instead, all it does is put the last of these frames on top of the video for the entire duration.

After that, the video is exported using AVAssetExportSession, as shown below.

 mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                                                                                       inLayer:parentLayer];

 // create a layer instruction for our newly created animation tool
 AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:gVideoTrack];
 AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
 [instruction setTimeRange:CMTimeRangeMake(kCMTimeZero, [mutableComposition duration])];
 [layerInstruction setOpacity:1.0f atTime:kCMTimeZero];
 [layerInstruction setOpacity:0.0f atTime:mutableComposition.duration];
 instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];

 // set the instructions on our videoComposition
 mutableVideoComposition.instructions = [NSArray arrayWithObject:instruction];

 // export final composition to a video file
 // convert the video path into a URL for our exporter to create a file at
 NSString* vidPath = CreateNSString(outputVideoPath);
 NSURL* vidURL = [NSURL fileURLWithPath:vidPath];

 AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition presetName:AVAssetExportPreset1280x720];
 exporter.outputFileType = AVFileTypeMPEG4;
 exporter.outputURL = vidURL;
 exporter.videoComposition = mutableVideoComposition;
 exporter.timeRange = CMTimeRangeMake(kCMTimeZero, mutableComposition.duration);

 // Asynchronously export the composition to a video file and save this file to the camera roll once export completes.
 [exporter exportAsynchronouslyWithCompletionHandler:^{
     dispatch_async(dispatch_get_main_queue(), ^{
         if (exporter.status == AVAssetExportSessionStatusCompleted)
         {
             DebugLog("!!!file created!!!");
             _Close();
         }
         else if (exporter.status == AVAssetExportSessionStatusFailed)
         {
             DebugLog("failed damn");
             DebugLog(cStringCopy([[[exporter error] localizedDescription] UTF8String]));
             DebugLog(cStringCopy([[[exporter error] description] UTF8String]));
             _Close();
         }
         else
         {
             DebugLog("NoIdea");
             _Close();
         }
     });
 }];

It looks to me as though the animations are never started, but I am not sure. Is this the right way to merge image data into a video like this?

Any help would be greatly appreciated.

1 answer

Well, I ended up solving my problem a different way. The animation route was not working, so my solution was to compile all of my insertable images into a temporary video file and use that video to insert the images into my final video.

Starting from the first link I originally posted, ASSETWriterInput to create video with UIImages on iPhone issues, I created the following function to build my temporary video:

 void CreateFrameImageVideo(NSString* path)
 {
     NSLog(@"Creating writer at path %@", path);
     NSError *error = nil;
     AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                             fileType:AVFileTypeMPEG4
                                                                error:&error];

     NSLog(@"Creating video codec settings");
     NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    [NSNumber numberWithInt:gVideoTrack.estimatedDataRate/*128000*/], AVVideoAverageBitRateKey,
                                    [NSNumber numberWithInt:gVideoTrack.nominalFrameRate], AVVideoMaxKeyFrameIntervalKey,
                                    AVVideoProfileLevelH264MainAutoLevel, AVVideoProfileLevelKey,
                                    nil];

     NSLog(@"Creating video settings");
     NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                    AVVideoCodecH264, AVVideoCodecKey,
                                    codecSettings, AVVideoCompressionPropertiesKey,
                                    [NSNumber numberWithInt:1280], AVVideoWidthKey,
                                    [NSNumber numberWithInt:720], AVVideoHeightKey,
                                    nil];

     NSLog(@"Creating writer input");
     AVAssetWriterInput* writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                           outputSettings:videoSettings] retain];

     NSLog(@"Creating adaptor");
     AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                      assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                      sourcePixelBufferAttributes:nil];

     [videoWriter addInput:writerInput];

     NSLog(@"Starting session");
     // start a session:
     [videoWriter startWriting];
     [videoWriter startSessionAtSourceTime:kCMTimeZero];

     CMTime timeOffset = kCMTimeZero; //CMTimeMake(0, 600);

     NSLog(@"Video Width %d, Height: %d, writing frame video to file", gWidth, gHeight);

     CVPixelBufferRef buffer;

     for (int i = 0; i < gAnalysisFrames.size(); i++)
     {
         while (adaptor.assetWriterInput.readyForMoreMediaData == FALSE)
         {
             NSLog(@"Waiting inside a loop");
             NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
             [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
         }

         // write samples:
         buffer = pixelBufferFromCGImage(gAnalysisFrames[i].frameImage, gWidth, gHeight);
         [adaptor appendPixelBuffer:buffer withPresentationTime:timeOffset];

         timeOffset = CMTimeAdd(timeOffset, gAnalysisFrames[i].duration);
     }

     while (adaptor.assetWriterInput.readyForMoreMediaData == FALSE)
     {
         NSLog(@"Waiting outside a loop");
         NSDate *maxDate = [NSDate dateWithTimeIntervalSinceNow:0.1];
         [[NSRunLoop currentRunLoop] runUntilDate:maxDate];
     }

     buffer = pixelBufferFromCGImage(gAnalysisFrames[gAnalysisFrames.size()-1].frameImage, gWidth, gHeight);
     [adaptor appendPixelBuffer:buffer withPresentationTime:timeOffset];

     NSLog(@"Finishing session");
     // finish the session:
     [writerInput markAsFinished];
     [videoWriter endSessionAtSourceTime:timeOffset];
     BOOL successfulWrite = [videoWriter finishWriting];

     // if we failed to write the video
     if (!successfulWrite)
     {
         NSLog(@"Session failed with error: %@", [[videoWriter error] description]);

         // delete the temporary file created
         NSFileManager *fileManager = [NSFileManager defaultManager];
         if ([fileManager fileExistsAtPath:path])
         {
             NSError *error;
             if ([fileManager removeItemAtPath:path error:&error] == NO)
             {
                 NSLog(@"removeItemAtPath %@ error:%@", path, error);
             }
         }
     }
     else
     {
         NSLog(@"Session complete");
     }

     [writerInput release];
 }
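
The pixelBufferFromCGImage helper used above is not shown in the original answer. A minimal sketch of what it might look like, assuming a 32-bit ARGB buffer (the actual implementation may differ):

 // Sketch of a pixelBufferFromCGImage helper: renders a CGImage into a newly
 // created CVPixelBuffer. Not part of the original answer; the caller is
 // responsible for releasing the returned buffer with CVPixelBufferRelease().
 static CVPixelBufferRef pixelBufferFromCGImage(CGImageRef image, int width, int height)
 {
     NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGImageCompatibilityKey,
                              [NSNumber numberWithBool:YES], (id)kCVPixelBufferCGBitmapContextCompatibilityKey,
                              nil];

     CVPixelBufferRef pixelBuffer = NULL;
     CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                           kCVPixelFormatType_32ARGB,
                                           (CFDictionaryRef)options,
                                           &pixelBuffer);
     if (status != kCVReturnSuccess || pixelBuffer == NULL)
     {
         return NULL;
     }

     CVPixelBufferLockBaseAddress(pixelBuffer, 0);
     void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);

     // draw the frame image into the pixel buffer's backing store
     CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
     CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                  CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                  colorSpace,
                                                  kCGImageAlphaNoneSkipFirst);
     CGContextDrawImage(context, CGRectMake(0, 0, width, height), image);

     CGContextRelease(context);
     CGColorSpaceRelease(colorSpace);
     CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

     return pixelBuffer;
 }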

After the video is created, it is loaded as an AVAsset and its video track is extracted; the video is then inserted by replacing the following line (from the first code block in the original post)

 [mutableCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(CMTimeAdd(gFrames[i].startTime, timeOffset), gFrames[i].duration)]; 

with:

 [mutableCompositionTrack insertTimeRange:CMTimeRangeMake(timeOffset,gAnalysisFrames[i].duration) ofTrack:gFramesTrack atTime:CMTimeAdd(gAnalysisFrames[i].startTime, timeOffset) error:&gError]; 

where gFramesTrack is the AVAssetTrack extracted from the temporary frame video, as sketched below.
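
The answer does not show how the temporary video is loaded, so here is a minimal sketch of that step; the framesVideoPath variable is hypothetical and stands for whatever path was passed to CreateFrameImageVideo:

 // build the temporary frame video, then load it and pull out its video track
 NSString* framesVideoPath = ...; // hypothetical: path used for the temporary video
 CreateFrameImageVideo(framesVideoPath);

 AVURLAsset* gFramesAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:framesVideoPath] options:nil];
 AVAssetTrack* gFramesTrack = [[gFramesAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];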

All of the code related to the CALayer and CABasicAnimation objects was removed, since it simply did not work.

Not the most elegant solution, I think, but at least it works. I hope someone finds this helpful.

This code also works on iOS devices (tested on an iPad 3).

Note: The DebugLog function from the original post is just a callback to a function that prints log messages; if necessary, those calls can be replaced with NSLog().
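
If you want to compile the snippets without that callback, a trivial stand-in (purely an assumption, not part of the original project) could be:

 // hypothetical stand-in for the original DebugLog callback
 static void DebugLog(const char* message)
 {
     NSLog(@"%s", message);
 }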
