Long delay before video appears when creating AVPlayer inside exportAsynchronouslyWithCompletionHandler

When playing a video exported with AVAssetExportSession, we hear the sound long before we see the video. The audio plays immediately, but the video only appears after several playback loops (i.e., the video starts and ends several times). In other words, we hear the soundtrack several times over before we see any images.

We use AutoLayout on iOS 8.

Using the following test, we isolated the problem to exportAsynchronouslyWithCompletionHandler. In both code blocks we play an existing video that has nothing to do with the export, so the export process itself is ruled out as a variable.

Code 1 plays both video and audio from the start, while Code 2 plays only audio at first and shows the video after a delay of 10-60 seconds (after the video has looped several times).

The only difference between the two blocks is that one calls playExistingVideo() inside exportAsynchronouslyWithCompletionHandler and the other does not.

Any ideas? Is it possible that the audio is exported first and is ready to play before the video? Is something export-related happening on another thread?

    func initPlayer(videoURL: NSURL) {
        // Create player
        player = AVPlayer(URL: videoURL)
        let playerItem = player.currentItem
        let asset = playerItem.asset
        playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = videoView.frame
        view.layer.addSublayer(playerLayer)

        player.seekToTime(kCMTimeZero)
        player.actionAtItemEnd = .None
        player.play()

        // Get notified when video done for looping purposes
        NSNotificationCenter.defaultCenter().addObserver(self, selector: "playerItemDidReachEnd:",
            name: AVPlayerItemDidPlayToEndTimeNotification, object: playerItem)

        // Log status
        println("Initialized video player: \(CMTimeGetSeconds(asset.duration)) seconds & \(asset.tracks.count) tracks for \(videoURL)")
    }

    func playExistingVideo() {
        let filename = "/ChopsticksVideo.mp4"
        let allPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
        let docsPath = allPaths[0] as! NSString
        let exportPath = docsPath.stringByAppendingFormat(filename)
        let exportURL = NSURL.fileURLWithPath(exportPath as String)!

        initPlayer(exportURL)
    }

Code 1:

    // Create exporter
    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportURL
    exporter.shouldOptimizeForNetworkUse = true

    playExistingVideo()

Code 2:

    // Create exporter
    let exporter = AVAssetExportSession(asset: mainComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeMPEG4
    exporter.outputURL = exportURL
    exporter.shouldOptimizeForNetworkUse = true

    // -- Export video
    exporter.exportAsynchronouslyWithCompletionHandler({
        self.playExistingVideo()
    })
2 answers

I am going to suggest that the problem is here:

    // Create player
    player = AVPlayer(URL: videoURL)
    let playerItem = player.currentItem
    let asset = playerItem.asset
    playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = videoView.frame
    view.layer.addSublayer(playerLayer)

    player.seekToTime(kCMTimeZero)
    player.actionAtItemEnd = .None
    player.play()

You see, when you create an AVPlayer from a video URL, it comes into the world not yet ready to play. It can often start playing audio quite quickly, but the video takes longer to prepare. That would explain the kind of delay you describe: sound without video.

Instead of waiting for the video to be ready, you just go ahead and call play() immediately. My suggestion is to do what I explain in my book (that link goes to the actual code): create the player and the layer, but then use KVO so that you are notified when the player layer is ready for display, and only then add the layer and start playback.
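A minimal sketch of that approach, in the same Swift 1.x style as the question's code (this is my own illustrative restructuring of the question's initPlayer, not the book's exact code; it assumes the same player, playerLayer and videoView properties inside a UIViewController subclass that imports AVFoundation):

    // Replaces the question's initPlayer: create the player and layer,
    // but defer adding the layer and calling play() until it is ready.
    func initPlayer(videoURL: NSURL) {
        player = AVPlayer(URL: videoURL)
        playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = videoView.frame

        // Observe the layer instead of playing immediately; the video
        // track needs time to become displayable even after audio is ready.
        playerLayer.addObserver(self, forKeyPath: "readyForDisplay",
            options: .New, context: nil)
    }

    override func observeValueForKeyPath(keyPath: String, ofObject object: AnyObject,
        change: [NSObject : AnyObject], context: UnsafeMutablePointer<Void>) {
        if keyPath == "readyForDisplay" && playerLayer.readyForDisplay {
            playerLayer.removeObserver(self, forKeyPath: "readyForDisplay")
            // The layer can now render video frames, so attach it and start playback
            view.layer.addSublayer(playerLayer)
            player.actionAtItemEnd = .None
            player.play()
        } else {
            super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
        }
    }

This way the layer only appears on screen once it can actually show frames, so you never get the "audio only" period.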

In addition, I have one more suggestion. It seems to me there is a danger that you are setting up your interface (adding the layer) and calling play() on a background thread, which is likely to cause delays of various kinds. You seem to be assuming that the completion handler from exportAsynchronouslyWithCompletionHandler: is called on the main thread, and you go straight ahead, call the next method, and continue configuring your interface from there. That is a very risky assumption. In my experience, you should never assume that any AVFoundation completion handler runs on the main thread. You should step out to the main thread with dispatch_async inside the completion handler and continue only from there. If you look at the code I linked to, you will see that this is exactly what I do.
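Applied to the question's Code 2, that advice would look roughly like this (a sketch, not my exact code; the status check is an extra safeguard I have added here):

    // -- Export video
    exporter.exportAsynchronouslyWithCompletionHandler({
        // The completion handler may run on a background queue, so hop
        // to the main queue before touching layers or starting playback.
        dispatch_async(dispatch_get_main_queue()) {
            if exporter.status == .Completed {
                self.playExistingVideo()
            } else {
                println("Export failed: \(exporter.error)")
            }
        }
    })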


For those who stumble on this question later: the fix was buried in the comments on the accepted answer. Wrapping the completion handler's work in dispatch_async, as shown below, solved it:

    [exporter exportAsynchronouslyWithCompletionHandler:^(void){
        dispatch_async(dispatch_get_main_queue(), ^{
            switch (exporter.status) {
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"Video Merge Successful");
                    break;
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"Failed:%@", exporter.error.description);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Canceled:%@", exporter.error);
                    break;
                case AVAssetExportSessionStatusExporting:
                    NSLog(@"Exporting!");
                    break;
                case AVAssetExportSessionStatusWaiting:
                    NSLog(@"Waiting");
                    break;
                default:
                    break;
            }
        });
    }];

Source: https://habr.com/ru/post/989779/

