AVAssetExportSession rotation error when video is in camera roll

I am trying to convert a .mov video to .mp4 and correct its orientation at the same time. The code below works fine when the video is recorded with UIImagePickerController, but if the video is selected from the camera roll I get this error, and I don't understand why:

Export failed: Operation Stopped : Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x1815ca50 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

I tried to save the video to another file first, but that didn't matter.

Here is the code I use to convert the video:

    - (void)convertVideoToLowQuailtyAndFixRotationWithInputURL:(NSURL *)inputURL handler:(void (^)(NSURL *outURL))handler
    {
        if ([[inputURL pathExtension] isEqualToString:@"MOV"]) {
            NSURL *outputURL = [inputURL URLByDeletingPathExtension];
            outputURL = [outputURL URLByAppendingPathExtension:@"mp4"];

            AVURLAsset *avAsset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
            AVAssetTrack *sourceVideoTrack = [[avAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
            AVAssetTrack *sourceAudioTrack = [[avAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

            // Build a composition containing the source video and audio tracks.
            AVMutableComposition *composition = [AVMutableComposition composition];
            AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil];
            [compositionVideoTrack setPreferredTransform:sourceVideoTrack.preferredTransform];

            AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
            [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:sourceAudioTrack atTime:kCMTimeZero error:nil];

            AVMutableVideoComposition *videoComposition = [self getVideoComposition:avAsset];

            NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:avAsset];
            if ([compatiblePresets containsObject:AVAssetExportPresetMediumQuality]) {
                AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
                exportSession.outputURL = outputURL;
                exportSession.outputFileType = AVFileTypeMPEG4;
                exportSession.shouldOptimizeForNetworkUse = YES;
                exportSession.videoComposition = videoComposition;

                [exportSession exportAsynchronouslyWithCompletionHandler:^{
                    switch ([exportSession status]) {
                        case AVAssetExportSessionStatusFailed:
                            NSLog(@"Export failed: %@ : %@", [[exportSession error] localizedDescription], [exportSession error]);
                            handler(nil);
                            break;
                        case AVAssetExportSessionStatusCancelled:
                            NSLog(@"Export canceled");
                            handler(nil);
                            break;
                        default:
                            handler(outputURL);
                            break;
                    }
                }];
            }
        } else {
            handler(inputURL);
        }
    }

    - (AVMutableVideoComposition *)getVideoComposition:(AVAsset *)asset
    {
        AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        AVMutableComposition *composition = [AVMutableComposition composition];
        AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

        // Swap width and height for portrait video so the render size matches the display orientation.
        CGSize videoSize = videoTrack.naturalSize;
        BOOL isPortrait_ = [self isVideoPortrait:asset];
        if (isPortrait_) {
            // NSLog(@"video is portrait");
            videoSize = CGSizeMake(videoSize.height, videoSize.width);
        }
        composition.naturalSize = videoSize;
        videoComposition.renderSize = videoSize;
        videoComposition.frameDuration = CMTimeMakeWithSeconds(1 / videoTrack.nominalFrameRate, 600);

        AVMutableCompositionTrack *compositionVideoTrack;
        compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:videoTrack atTime:kCMTimeZero error:nil];

        // Apply the source track's preferred transform so the exported video is rotated correctly.
        AVMutableVideoCompositionLayerInstruction *layerInst;
        layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
        [layerInst setTransform:videoTrack.preferredTransform atTime:kCMTimeZero];

        AVMutableVideoCompositionInstruction *inst = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        inst.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
        inst.layerInstructions = [NSArray arrayWithObject:layerInst];
        videoComposition.instructions = [NSArray arrayWithObject:inst];

        return videoComposition;
    }
+6
3 answers

AVFoundation Error Constant -11841 means that you have an invalid video composition. See this link if you want more information about error constants: https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVFoundation_ErrorConstants/Reference/reference.html

Nothing obviously wrong jumps out at me, but I can suggest the following ways to narrow down the source of your problem.

First, instead of passing nil for the error parameter in these calls:

 [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:nil]; 

create an NSError object and pass a reference to it as follows:

    NSError *error = nil;
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, avAsset.duration) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:&error];

Check the error to make sure your video and audio tracks are correctly inserted into the composition track. The error should be nil if everything goes well.

 if(error) NSLog(@"Insertion error: %@", error); 

You can also check your AVAsset's composable, exportable, and hasProtectedContent properties. If they are not YES, YES, and NO respectively, there may be a problem creating your new video file.
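For example, a quick check of those properties might look like this (a minimal sketch, assuming the avAsset variable from the question's code):

    // Sanity-check the source asset before building the composition.
    // composable, exportable, and hasProtectedContent are BOOL properties of AVAsset.
    NSLog(@"composable: %d, exportable: %d, protected: %d",
          avAsset.composable, avAsset.exportable, avAsset.hasProtectedContent);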

I have occasionally seen problems where a time range created with a 600 timescale for an audio track does not play nicely in a composition with a video track. You may want to create a new CMTime for the duration (avAsset.duration) in

 CMTimeRangeMake(kCMTimeZero, avAsset.duration) 

for inserting the audio track only. In the new CMTime, use a timescale of 44100 (or whatever the sample rate of your audio track is). The same goes for your videoComposition.frameDuration: depending on the nominalFrameRate of your video track, the time may not be represented correctly with a timescale of 600.
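Here is a minimal sketch of both suggestions, reusing the variable names from the question's code (compositionAudioTrack, sourceAudioTrack, avAsset, videoTrack, videoComposition); CMTimeConvertScale is the Core Media function for rescaling a CMTime to a new timescale:

    // Rescale the asset duration to the audio sample rate before inserting the audio track.
    CMTime audioDuration = CMTimeConvertScale(avAsset.duration, 44100, kCMTimeRoundingMethod_Default);
    NSError *audioError = nil;
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
                                   ofTrack:sourceAudioTrack
                                    atTime:kCMTimeZero
                                     error:&audioError];

    // Express the frame duration in the video track's own frame rate instead of a 600 timescale.
    videoComposition.frameDuration = CMTimeMake(1, (int32_t)videoTrack.nominalFrameRate);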

Finally, Apple provides a useful tool for debugging compositions:

https://developer.apple.com/library/mac/samplecode/AVCompositionDebugViewer/Introduction/Intro.html

It gives a visual representation of your composition, so you can see where things don't look the way they should.

+13

Try commenting out the line below and running the project:

 exportSession.videoComposition = videoComposition; 
+2

You should definitely use the isValidForAsset:timeRange:validationDelegate: method of AVVideoComposition; it will diagnose any problem with your video composition (a sketch of the call is shown after the snippet below). I had the same problem, and the solution for me was to create the layerInstruction with the AVMutableCompositionTrack instead of the original track:

 layerInst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack]; 
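For reference, a minimal sketch of the validation call mentioned above (assuming the videoComposition and composition objects from the question's code; passing nil as the delegate skips the detailed per-problem callbacks but still reports overall validity):

    BOOL isValid = [videoComposition isValidForAsset:composition
                                           timeRange:CMTimeRangeMake(kCMTimeZero, composition.duration)
                                  validationDelegate:nil];
    if (!isValid) {
        NSLog(@"The video composition is not valid for this asset");
    }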
+1
