AVCaptureSession with multiple outputs?

I am currently developing an iOS application that applies Core Image to the camera feed in order to take photos and videos, and I have run into a problem.

So far, I have been using AVCaptureVideoDataOutput to obtain the sample buffers and process them with Core Image, then displaying a simple preview, and I have also used it to capture and save photos.
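For context, the processing path looks roughly like the sketch below (the filter, ciContext, and previewBounds are simplified placeholders, not my actual code):

    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Wrap the camera frame in a CIImage and run it through a filter chain.
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
        [filter setValue:inputImage forKey:kCIInputImageKey];
        [filter setValue:@0.8 forKey:kCIInputIntensityKey];
        CIImage *outputImage = filter.outputImage;

        // Render the filtered frame for the live preview; the same CIImage is
        // rendered out when the user takes a still photo.
        [self.ciContext drawImage:outputImage
                           inRect:self.previewBounds
                         fromRect:outputImage.extent];
    }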

When I tried to implement video recording by writing the sample buffers to a file as I received them from the AVCaptureVideoDataOutput, the result had a very slow frame rate (probably because of the other image processing that was going on at the same time).

So, I was wondering: is it possible to have an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput running simultaneously on the same AVCaptureSession?

I gave it a quick try and found that as soon as I added the second output, my AVCaptureVideoDataOutput stopped receiving data.
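To make clear what I mean, a minimal sketch of the setup I am describing (object names are illustrative only):

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    // ... add the camera/microphone AVCaptureDeviceInputs here ...

    AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    dispatch_queue_t videoQueue = dispatch_queue_create("Video Data Queue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoQueue];

    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    if ([session canAddOutput:videoDataOutput]) [session addOutput:videoDataOutput];
    // As soon as this second output is added, the data output's delegate
    // stops being called -- the behaviour described above.
    if ([session canAddOutput:movieFileOutput]) [session addOutput:movieFileOutput];

    [session startRunning];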

If I can get this working, I hope it means that I can simply use the second output to record video at a high frame rate and post-process the video after the user has stopped recording.
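For that post-processing step, one possible approach (not necessarily what I will end up doing) would be to re-export the recorded file with the Core Image filter applied via AVVideoComposition; a rough sketch, assuming iOS 9+ and placeholder recordedURL/outputURL values:

    AVAsset *asset = [AVAsset assetWithURL:recordedURL];

    // Apply the same CIFilter offline, frame by frame, during export.
    AVVideoComposition *composition =
        [AVVideoComposition videoCompositionWithAsset:asset
                         applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
            CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];
            [filter setValue:request.sourceImage forKey:kCIInputImageKey];
            [request finishWithImage:filter.outputImage context:nil];
        }];

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = composition;
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        // Inspect exportSession.status / exportSession.error here.
    }];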

Any help would be greatly appreciated.

ios avfoundation avcapturesession
1 answer

It is easier than you think.

See: AVCamDemo

  • Capture data using AVCaptureVideoDataOutput.
  • Before recording, create a new serial dispatch queue, e.g. recordingQueue: recordingQueue = dispatch_queue_create("Movie Recording Queue", DISPATCH_QUEUE_SERIAL);
  • In the captureOutput:didOutputSampleBuffer:fromConnection: delegate method, retain the sample buffer, dispatch onto the recording queue, and write it to the file:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Retain the buffer so it stays valid until the async block has written it.
        CFRetain(sampleBuffer);
        dispatch_async(recordingQueue, ^{
            if (assetWriter) {
                if (connection == videoConnection) {
                    [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];
                } else if (connection == audioConnection) {
                    [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeAudio];
                }
            }
            CFRelease(sampleBuffer);
        });
    }

    - (void)writeSampleBuffer:(CMSampleBufferRef)sampleBuffer ofType:(NSString *)mediaType
    {
        CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

        // Start the writer session at the timestamp of the first buffer received.
        if (assetWriter.status == AVAssetWriterStatusUnknown) {
            if ([assetWriter startWriting]) {
                [assetWriter startSessionAtSourceTime:presentationTime];
            } else {
                NSLog(@"Error writing initial buffer");
            }
        }

        if (assetWriter.status == AVAssetWriterStatusWriting) {
            if ([mediaType isEqualToString:AVMediaTypeVideo]) {
                if (assetWriterVideoIn.readyForMoreMediaData) {
                    if (![assetWriterVideoIn appendSampleBuffer:sampleBuffer]) {
                        NSLog(@"Error writing video buffer");
                    }
                }
            } else if ([mediaType isEqualToString:AVMediaTypeAudio]) {
                if (assetWriterAudioIn.readyForMoreMediaData) {
                    if (![assetWriterAudioIn appendSampleBuffer:sampleBuffer]) {
                        NSLog(@"Error writing audio buffer");
                    }
                }
            }
        }
    }
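The snippet above assumes that assetWriter, assetWriterVideoIn, and assetWriterAudioIn have already been created before recording starts. A minimal setup sketch might look like this (the output settings are only examples):

    - (BOOL)setUpAssetWriterForURL:(NSURL *)outputURL error:(NSError **)error
    {
        assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                fileType:AVFileTypeQuickTimeMovie
                                                   error:error];
        if (!assetWriter) return NO;

        // Video input: H.264 at an example resolution.
        NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                         AVVideoWidthKey  : @1280,
                                         AVVideoHeightKey : @720 };
        assetWriterVideoIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                outputSettings:videoSettings];
        assetWriterVideoIn.expectsMediaDataInRealTime = YES;

        // Audio input: mono AAC at an example sample rate.
        AudioChannelLayout acl = {0};
        acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;
        NSDictionary *audioSettings = @{ AVFormatIDKey         : @(kAudioFormatMPEG4AAC),
                                         AVNumberOfChannelsKey : @1,
                                         AVSampleRateKey       : @44100.0,
                                         AVChannelLayoutKey    : [NSData dataWithBytes:&acl length:sizeof(acl)] };
        assetWriterAudioIn = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                outputSettings:audioSettings];
        assetWriterAudioIn.expectsMediaDataInRealTime = YES;

        if ([assetWriter canAddInput:assetWriterVideoIn]) [assetWriter addInput:assetWriterVideoIn];
        if ([assetWriter canAddInput:assetWriterAudioIn]) [assetWriter addInput:assetWriterAudioIn];
        return YES;
    }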


