I am trying to combine two videos recorded with the camera through a UIImagePickerController. I managed to merge them into one video, but I have some problems with its orientation.
As I understand it, UIImagePickerController captures all videos in landscape, which means that videos recorded in portrait come out rotated 90°.
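For reference, the capture orientation can be inferred from the video track's preferredTransform. A minimal sketch of that mapping, using plain doubles for the transform components so it stays self-contained (the helper name and the use of raw components instead of CGAffineTransform are my own, not AVFoundation API):

```swift
// Infers capture orientation from the (a, b, c, d) components of a
// track's preferredTransform. Hypothetical helper for illustration.
func orientationFromTransform(_ a: Double, _ b: Double, _ c: Double, _ d: Double) -> String {
    switch (a, b, c, d) {
    case (0, 1, -1, 0):  return "portrait"           // rotated 90°
    case (0, -1, 1, 0):  return "portraitUpsideDown" // rotated 270°
    case (1, 0, 0, 1):   return "landscapeRight"     // identity transform
    case (-1, 0, 0, -1): return "landscapeLeft"      // rotated 180°
    default:             return "unknown"
    }
}
```

A portrait recording therefore carries a rotation in its preferredTransform even though its pixel buffer is stored landscape.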
After each recording, I append the new video to an array:

    func imagePickerController(picker: UIImagePickerController!, didFinishPickingMediaWithInfo info: NSDictionary) {
        // keep the URL of the freshly recorded clip
        let tempImage = info[UIImagePickerControllerMediaURL] as NSURL
        videos.append(tempImage)
        let pathString = tempImage.relativePath
        self.dismissViewControllerAnimated(true, completion: nil)
    }
Then, when I want to merge them, I loop over the videos, create a layer instruction for each one, and add it to another array:

    var composition = AVMutableComposition()
    let trackVideo: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
    let trackAudio: AVMutableCompositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: CMPersistentTrackID())
    var insertTime = kCMTimeZero

    for i in 0...(videos.count - 1) {
        let moviePathUrl = videos[i]
        let sourceAsset = AVURLAsset(URL: moviePathUrl, options: nil)
        let tracks = sourceAsset.tracksWithMediaType(AVMediaTypeVideo)
        let audios = sourceAsset.tracksWithMediaType(AVMediaTypeAudio)

        if tracks.count > 0 {
            let videoDuration = CMTimeRangeMake(kCMTimeZero, sourceAsset.duration)
            let assetTrack: AVAssetTrack = tracks[0] as AVAssetTrack
            let assetTrackAudio: AVAssetTrack = audios[0] as AVAssetTrack

            // append this clip's video and audio at the current insert point
            trackVideo.insertTimeRange(videoDuration, ofTrack: assetTrack, atTime: insertTime, error: nil)
            trackAudio.insertTimeRange(videoDuration, ofTrack: assetTrackAudio, atTime: insertTime, error: nil)

            // create a layer instruction for this clip and collect it
            let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: trackVideo)
            instructions.append(layerInstruction)

            // advance the insert point by this clip's duration
            insertTime = CMTimeAdd(insertTime, sourceAsset.duration)
        }
    }
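The insert point for each clip advances by the previous clip's duration. Modelled with CMTime-style rational values (the type and function names here are illustrative stand-ins, not the CoreMedia API), the accumulation works like this:

```swift
// Minimal stand-in for CMTime: a rational time (value / timescale).
struct RationalTime {
    var value: Int64
    var timescale: Int32
    var seconds: Double { return Double(value) / Double(timescale) }
}

// Mirrors what CMTimeAdd does for the running insertTime:
// convert both operands to a common timescale, then add the values.
func add(_ lhs: RationalTime, _ rhs: RationalTime) -> RationalTime {
    let ts: Int64 = 600 // common media timescale
    let l = lhs.value * ts / Int64(lhs.timescale)
    let r = rhs.value * ts / Int64(rhs.timescale)
    return RationalTime(value: l + r, timescale: Int32(ts))
}

// Accumulate insert points for clips of 2.0 s, 3.5 s and 1.25 s.
var insertTime = RationalTime(value: 0, timescale: 600)
for duration in [RationalTime(value: 1200, timescale: 600),
                 RationalTime(value: 2100, timescale: 600),
                 RationalTime(value: 750, timescale: 600)] {
    insertTime = add(insertTime, duration)
}
// insertTime.seconds is now 6.75
```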
Once all the instructions are created, I add them to the main composition instruction and create an export session:

    var instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, insertTime)
    instruction.layerInstructions = instructions

    var mainCompositionInst = AVMutableVideoComposition()
    mainCompositionInst.instructions = NSArray(object: instruction)
    mainCompositionInst.frameDuration = CMTimeMake(1, 60)
    mainCompositionInst.renderSize = CGSizeMake(300, 300)

    var exporter = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = mainCompositionInst
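If it helps to reason about the rotation involved: a portrait clip stored in landscape needs a 90° rotation plus a translation by the stored frame's height to land back in positive render coordinates. The geometry, sketched with a plain affine type (the type and names are illustrative, not the CoreGraphics/AVFoundation API):

```swift
// A plain 2x3 affine transform, mirroring how CGAffineTransform maps points:
// x' = a*x + c*y + tx,  y' = b*x + d*y + ty. Illustrative only.
struct Affine {
    var a: Double, b: Double, c: Double, d: Double, tx: Double, ty: Double
    func apply(_ x: Double, _ y: Double) -> (Double, Double) {
        return (a * x + c * y + tx, b * x + d * y + ty)
    }
}

// 90° rotation (cos = 0, sin = 1) followed by a translation by the
// stored frame's height, moving the rotated frame into positive coordinates.
let storedHeight = 1080.0
let rotate90 = Affine(a: 0, b: 1, c: -1, d: 0, tx: storedHeight, ty: 0)

let topLeft = rotate90.apply(0, 0)               // lands at (1080, 0)
let bottomLeft = rotate90.apply(0, storedHeight) // lands at (0, 0)
```

Without a transform like this applied per clip through the layer instructions, portrait clips render sideways (and partly outside the frame) in the merged output.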
What am I missing?