I created a custom camera similar to Snapchat's using AVFoundation. The photo side works: I take a picture and display the image. Now I'm trying to record a video and play it back.
I successfully captured a video file and tested it: it plays with MPMoviePlayer, but I can't get it to play with AVPlayer.
    - (void)takeVideoAction:(id)sender {
        NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output.mp4"];
        NSFileManager *manager = [[NSFileManager alloc] init];
        if ([manager fileExistsAtPath:outputPath]) {
            [manager removeItemAtPath:outputPath error:nil];
        }
        [movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputPath]
                                     recordingDelegate:self];
    }
    - (void)captureOutput:(AVCaptureFileOutput *)captureOutput
    didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
                        fromConnections:(NSArray *)connections
                                  error:(NSError *)error {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:outputFileURL options:nil];
        [asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:^{
            dispatch_async(dispatch_get_main_queue(), ^{
                NSError *error = nil;
                if ([asset statusOfValueForKey:@"tracks" error:&error] == AVKeyValueStatusFailed) {
                    NSLog(@"error: %@", [error description]);
                } else {
                    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
                    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];
                    self.playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
                    self.playerLayer.frame = self.view.bounds;
                    [self.view.layer addSublayer:self.playerLayer];
                    [self.player play];
                }
            });
        }];
    }
This is my first experience using AVFoundation and AVPlayer. Shouldn't I be able to call [self.player play] once the asset has loaded? Or do I need to wait until the player item's status is AVPlayerItemStatusReadyToPlay?
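For what it's worth, the pattern I've seen suggested elsewhere is to observe the item's status with KVO and only call play once it reaches AVPlayerItemStatusReadyToPlay. A rough sketch of what I mean (the context pointer and the surrounding controller code are my own assumptions, not from my project as shown above):

```objc
// Unique context pointer to distinguish this observation (assumed helper).
static void *PlayerItemStatusContext = &PlayerItemStatusContext;

// After creating the item, observe its status before playing:
self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
[self.playerItem addObserver:self
                  forKeyPath:@"status"
                     options:NSKeyValueObservingOptionNew
                     context:PlayerItemStatusContext];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

// Elsewhere in the same view controller:
- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    if (context == PlayerItemStatusContext) {
        if (self.playerItem.status == AVPlayerItemStatusReadyToPlay) {
            // Item is ready; safe to start playback now.
            [self.player play];
        } else if (self.playerItem.status == AVPlayerItemStatusFailed) {
            NSLog(@"Player item failed: %@", self.playerItem.error);
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}
```

Is this the right approach, or should loading the "tracks" key already be enough?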
ERROR: Error Domain=NSURLErrorDomain Code=-1100 "The requested URL was not found on this server."
Peter