How to send streaming video from an iOS device to a server?

I need to send real-time video from an iPhone to a server. I create a capture session and use AVCaptureMovieFileOutput:

    NSError *error = nil;

    captureSession = [[AVCaptureSession alloc] init];

    // find and attach devices
    AVCaptureDevice *muxedDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed];
    if (muxedDevice) {
        NSLog(@"got muxedDevice");
        AVCaptureDeviceInput *muxedInput = [AVCaptureDeviceInput deviceInputWithDevice:muxedDevice
                                                                                 error:&error];
        if (muxedInput) {
            [captureSession addInput:muxedInput];
        }
    } else {
        AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (videoDevice) {
            NSLog(@"got videoDevice");
            AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
                                                                                     error:&error];
            if (videoInput) {
                [captureSession addInput:videoInput];
            }
        }
        AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        if (audioDevice) {
            NSLog(@"got audioDevice");
            AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice
                                                                                     error:&error];
            if (audioInput) {
                [captureSession addInput:audioInput];
            }
        }
    }

    // create a preview layer from the session and add it to the UI
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.frame = view.layer.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    previewLayer.orientation = AVCaptureVideoOrientationPortrait;
    [view.layer addSublayer:previewLayer];

    // create the capture file output
    captureMovieOutput = [[AVCaptureMovieFileOutput alloc] init];
    if (!captureMovieURL) {
        captureMoviePath = [[self getMoviePathWithName:MOVIE_FILE_NAME] retain];
        captureMovieURL = [[NSURL alloc] initFileURLWithPath:captureMoviePath];
    }
    NSLog(@"recording to %@", captureMovieURL);
    [captureSession addOutput:captureMovieOutput];

I use AVAssetExportSession to extract a 10-second fragment:

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:captureMovieURL
                                            options:[NSDictionary dictionaryWithObject:@"YES"
                                                                                forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];
    AVMutableComposition *composition = [AVMutableComposition composition];
    CMTime endTime;
    CMTime duration = CMTimeMake(6000, 600); // 10 seconds at a timescale of 600
    if (asset.duration.value - startFragment.value < 6000) {
        endTime = asset.duration;
    } else {
        endTime = CMTimeMake(startFragment.value + 6000, 600);
    }
    CMTimeRange editRange = CMTimeRangeMake(startFragment, duration);
    startFragment = CMTimeMake(endTime.value, 600);
    NSError *editError = nil;

    // and add the range into the composition
    [composition insertTimeRange:editRange ofAsset:asset atTime:composition.duration error:&editError];

    AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition
                                                                            presetName:AVAssetExportPresetPassthrough];
    exportSession.shouldOptimizeForNetworkUse = YES;
    NSString *name = [NSString stringWithFormat:MOVUE_SEGMENT_NAME, countMovies];
    NSString *path = [NSString stringWithFormat:@"file://localhost%@", [self getMoviePathWithName:name]];
    NSURL *url = [NSURL URLWithString:path];
    NSLog(@"urlsegment = %@", url);
    exportSession.outputFileType = AVFileTypeMPEG4;
    exportSession.outputURL = url;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            countMovies++;
            NSLog(@"AVAssetExportSessionStatusCompleted");
        } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
            NSLog(@"AVAssetExportSessionStatusFailed: %@", [exportSession.error localizedDescription]);
        } else {
            NSLog(@"Export Session Status: %d", exportSession.status);
        }
    }];

I send the video to the server once the export session status is completed, but this is very slow: producing a 10-second segment and sending it to the server takes about 15 seconds. If the movie is shorter than 10 seconds, nothing changes. How can I solve this problem? What is the best way to stream video to a server?

1 answer

Using ffmpeg for the encoding is probably better than AVAssetExportSession, but working with ffmpeg is much more complicated than using AVAssetExportSession.
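For illustration, a minimal sketch of what ffmpeg's segment muxer can do with an already-recorded movie (file names here are placeholders, not from the question). Because it copies the streams instead of re-encoding, splitting is nearly instant, which addresses the 15-seconds-per-10-second-clip problem:

```shell
# Split capture.mov into ~10-second MPEG-TS segments without re-encoding.
# -c copy            : passthrough, analogous to AVAssetExportPresetPassthrough
# -f segment         : use ffmpeg's segment muxer
# -segment_time 10   : target segment duration in seconds
# -bsf:v h264_mp4toannexb : convert H.264 from MP4 framing to Annex B for TS
ffmpeg -i capture.mov \
       -c copy \
       -bsf:v h264_mp4toannexb \
       -f segment \
       -segment_time 10 \
       -segment_format mpegts \
       seg%03d.ts
```

Each resulting seg000.ts, seg001.ts, ... can be uploaded as soon as it is written, which is essentially how HTTP Live Streaming segmenting works. Note that segment boundaries fall on keyframes, so actual durations are approximate.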


Source: https://habr.com/ru/post/1397750/

