Multiple AVCaptureVideoDataOutput in the same AVCaptureSession

I was wondering if it is possible to add multiple AVCaptureVideoDataOutput objects to an AVCaptureSession that uses a single camera device as input.

My experiments show that adding a second AVCaptureVideoDataOutput makes canAddOutput: return NO. But I cannot find anything in the Apple documentation saying that multiple data outputs are prohibited.

1 answer

You cannot add more than one AVCaptureVideoDataOutput to a single AVCaptureSession.
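As a quick confirmation of the behavior described in the question, here is a minimal sketch (the variable names are illustrative, not from the original answer); the canAddOutput: check for the second output returns NO:

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    [session addInput:input];

    AVCaptureVideoDataOutput *firstOutput = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:firstOutput];                    // the first one is accepted

    AVCaptureVideoDataOutput *secondOutput = [[AVCaptureVideoDataOutput alloc] init];
    BOOL canAdd = [session canAddOutput:secondOutput];  // NO: one video data output per session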

What you can do instead is pair each AVCaptureVideoDataOutput with its own AVCaptureSession.

Create two AVCaptureVideoDataOutput/AVCaptureSession pairs, then use them one at a time in the application, and you can achieve the goal.

In my case, I had to capture an image with the back camera and the front camera, one right after the other.

I created two separate sets of AVCaptureSession and AVCaptureStillImageOutput objects, as shown below.

    #import <AVFoundation/AVFoundation.h>
    #import <AudioToolbox/AudioToolbox.h> // for AudioServicesPlaySystemSound below

    /* Back camera */
    @property BOOL isBackRecording;
    @property (strong, nonatomic) AVCaptureDeviceInput *videoInputBack;
    @property (strong, nonatomic) AVCaptureStillImageOutput *imageOutputBack;
    @property (strong, nonatomic) AVCaptureSession *sessionBack;

    /* Front camera */
    @property BOOL isFrontRecording;
    @property (strong, nonatomic) AVCaptureDeviceInput *videoInputFront;
    @property (strong, nonatomic) AVCaptureStillImageOutput *imageOutputFront;
    @property (strong, nonatomic) AVCaptureSession *sessionFront;

In viewDidLoad I set up the back camera first and initialized the flags that track whether each session is currently capturing, as shown below.

    - (void)viewDidLoad {
        [super viewDidLoad];

        // Set up the back camera first; the front session is created later, on demand.
        [self setupBackAVCapture];

        self.isFrontRecording = NO;
        self.isBackRecording = NO;
    }

    - (void)setupBackAVCapture {
        NSError *error = nil;

        self.sessionBack = [[AVCaptureSession alloc] init];
        self.sessionBack.sessionPreset = AVCaptureSessionPresetPhoto;

        // The default video device is the back camera.
        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        self.videoInputBack = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
        [self.sessionBack addInput:self.videoInputBack];

        self.imageOutputBack = [[AVCaptureStillImageOutput alloc] init];
        [self.sessionBack addOutput:self.imageOutputBack];
    }
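The answer does not show any live preview. If you want the user to see the camera feed before capturing, you could attach an AVCaptureVideoPreviewLayer to the back session; a minimal sketch, assuming a full-screen preview:

    // Sketch only: attach a preview layer for the back session (not part of the original answer).
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.sessionBack];
    previewLayer.frame = self.view.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:previewLayer];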

Now, whenever the user taps the capture button, we take the back photo first using the code below.

    - (IBAction)buttonCapture:(id)sender {
        [self takeBackPhoto];
    }

    - (void)takeBackPhoto {
        [self.sessionBack startRunning];

        if (!self.isBackRecording) {
            self.isBackRecording = YES;

            AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

            AVCaptureConnection *videoConnection =
                [self.imageOutputBack connectionWithMediaType:AVMediaTypeVideo];
            if (videoConnection == nil) {
                self.isBackRecording = NO;
                return;
            }

            [self.imageOutputBack captureStillImageAsynchronouslyFromConnection:videoConnection
                completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                    if (imageDataSampleBuffer == NULL) {
                        self.isBackRecording = NO;
                        return;
                    }

                    NSData *imageData = [AVCaptureStillImageOutput
                        jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                    UIImage *image = [[UIImage alloc] initWithData:imageData];

                    UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
                    [self.imageView setImage:image];

                    // Stop the back session before starting the front one.
                    [self.sessionBack stopRunning];
                    self.isBackRecording = NO;

                    // Set up the front camera and capture the second photo.
                    [self setupFrontAVCapture];
                    [self takeFrontPhoto];
                }];
        }
    }

Once the back image is captured, we configure the second session with the setupFrontAVCapture method and capture the front image with the takeFrontPhoto method, as shown below.

    - (void)setupFrontAVCapture {
        NSError *error = nil;

        self.sessionFront = [[AVCaptureSession alloc] init];
        self.sessionFront.sessionPreset = AVCaptureSessionPresetPhoto;

        // Look up the front camera explicitly (see cameraWithPosition: below).
        AVCaptureDevice *camera = [self cameraWithPosition:AVCaptureDevicePositionFront];

        self.videoInputFront = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
        [self.sessionFront addInput:self.videoInputFront];

        self.imageOutputFront = [[AVCaptureStillImageOutput alloc] init];
        [self.sessionFront addOutput:self.imageOutputFront];
    }

    - (void)takeFrontPhoto {
        [self.sessionFront startRunning];

        if (!self.isFrontRecording) {
            self.isFrontRecording = YES;

            AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

            AVCaptureConnection *videoConnection =
                [self.imageOutputFront connectionWithMediaType:AVMediaTypeVideo];
            if (videoConnection == nil) {
                self.isFrontRecording = NO;
                return;
            }

            [self.imageOutputFront captureStillImageAsynchronouslyFromConnection:videoConnection
                completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                    if (imageDataSampleBuffer == NULL) {
                        self.isFrontRecording = NO;
                        return;
                    }

                    NSData *imageData = [AVCaptureStillImageOutput
                        jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                    UIImage *image = [[UIImage alloc] initWithData:imageData];

                    UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
                    [self.imageViewBack setImage:image];

                    [self.sessionFront stopRunning];
                    self.isFrontRecording = NO;
                }];
        }
    }
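setupFrontAVCapture relies on a cameraWithPosition: helper that the answer never defines. A minimal sketch of what it presumably does, enumerating devices with devicesWithMediaType: (the API that matches the era of this AVCaptureStillImageOutput code); the instance-method signature is an assumption:

    // Sketch of the undefined helper: walk all video-capable devices and
    // return the one at the requested position (nil if none matches).
    - (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position {
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == position) {
                return device;
            }
        }
        return nil;
    }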

This way you can use two separate sets of AVCaptureSession and AVCaptureStillImageOutput objects and achieve your goal.

Please let me know if anything is unclear.
