AVCaptureSession stopRunning takes a lot of time

I have an application using ZXing. When the user enters the scanner page, ZXing is initialized immediately, without tapping the "scan" button. With that I ran into the known problem of AVCaptureSession stopRunning taking a long time. The proposed solution worked, but unfortunately, when the user performs some quick actions, for example going to the settings page (which stops the capture) and quickly returning to the scanner page (which starts the capture), the application crashes.

I solved this by stopping the camera with dispatch_async on my own serial queue and initializing it with dispatch_sync on the same queue. It works fine unless initialization is called before the camera has finished stopping. In that case the initialization code waits for the camera-stop block to complete, BUT that block takes ~10 seconds (it usually takes about 1-2)! I have no idea what is going on.
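A minimal sketch of the ordering I am relying on (plain GCD, no AVFoundation; myQueue stands in for my capture queue, and usleep stands in for the slow stopRunning call): on a serial queue, a dispatch_sync block cannot start, and therefore cannot return, until every block enqueued before it has finished.

```objc
#import <Foundation/Foundation.h>
#include <unistd.h>

int main(void) {
    // Serial queue, like the myQueue used by stopCapture/initCapture.
    dispatch_queue_t myQueue =
        dispatch_queue_create("com.example.capture", DISPATCH_QUEUE_SERIAL);

    // stopCapture path: enqueued asynchronously; the usleep stands in
    // for the slow [captureSession stopRunning] call.
    dispatch_async(myQueue, ^{
        NSLog(@"stop capture");
        usleep(200000);
        NSLog(@"end stop capture");
    });

    // initCapture path: dispatch_sync on the SAME serial queue blocks
    // the caller until the stop block above has run to completion.
    dispatch_sync(myQueue, ^{
        NSLog(@"init capture");
    });
    NSLog(@"init returned");
    return 0;
}
```

So whatever makes stopRunning take ~10 seconds stalls my initialization for exactly that long as well.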

Do you have any suggestions?

Here is the code:

- (void)stopCapture {
#if HAS_AVFF
    decoding = NO;
    [self.prevLayer removeFromSuperlayer];
    dispatch_async(myQueue, ^{
        NSLog(@"stop capture");
        [captureSession stopRunning];
        NSLog(@"session stopped");
        if (captureSession.inputs.count > 0) {
            AVCaptureInput *input = [captureSession.inputs objectAtIndex:0];
            [captureSession removeInput:input];
        }
        if (captureSession.outputs.count > 0) {
            AVCaptureVideoDataOutput *output =
                (AVCaptureVideoDataOutput *)[captureSession.outputs objectAtIndex:0];
            [captureSession removeOutput:output];
        }
        self.prevLayer = nil;
        self.captureSession = nil;
        NSLog(@"end stop capture");
    });
#endif
}

- (void)initCapture {
    dispatch_sync(myQueue, ^{
        NSLog(@"init capture");
#if HAS_AVFF
        AVCaptureDeviceInput *captureInput =
            [AVCaptureDeviceInput deviceInputWithDevice:
                [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                                  error:nil];
        if (!captureInput) {
            NSLog(@"ERROR - CaptureInputNotInitialized");
        }

        AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
        captureOutput.alwaysDiscardsLateVideoFrames = YES;
        if (!captureOutput) {
            NSLog(@"ERROR - CaptureOutputNotInitialized");
        }
        [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
        [captureOutput setVideoSettings:videoSettings];

        self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
        self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; // 480x360 on a 4

        if ([self.captureSession canAddInput:captureInput]) {
            [self.captureSession addInput:captureInput];
        } else {
            NSLog(@"ERROR - cannot add input");
        }
        if ([self.captureSession canAddOutput:captureOutput]) {
            [self.captureSession addOutput:captureOutput];
        } else {
            NSLog(@"ERROR - cannot add output");
        }
        [captureOutput release];

        // The retained prevLayer property setter releases any previous layer.
        self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.prevLayer.frame = self.view.bounds;
        self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.view.layer addSublayer:self.prevLayer];

        [self.captureSession startRunning];
#endif
        NSLog(@"end init");
    });
}



