What is the correct way to optimize my AVCaptureSession?

I got my AVCaptureSession working, and it almost completely duplicates the Camera.app user interface; however, after a few seconds the application crashes, and I just can't find what I'm doing wrong. I really hope someone knows how to optimize this!

I am using ARC, and again, the whole session works fine at first but crashes a little later. The AVCaptureSession delegate method seems to be called constantly, many times per second. If there is a way to call this method only when the user taps the "take a picture" button, how can I do that while still keeping a live preview?

Thanks in advance!

Session setup

    NSError *error = nil;
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    [session addInput:input];

    output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Deliver sample buffers on a dedicated serial queue.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue); // required before iOS 6; ARC does not manage dispatch objects here

    output.videoSettings = [NSDictionary dictionaryWithObject:
                                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // `version` is the iOS version, determined elsewhere;
    // minFrameDuration only exists on iOS 4.x.
    if (version >= 4.0 && version < 5.0) {
        output.minFrameDuration = CMTimeMake(1, 15);
    }
    output.alwaysDiscardsLateVideoFrames = YES;

    previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:previewLayer];
    [self.view addSubview:camera_overlay];

    [session startRunning];

The AVCaptureSession delegate method that is invoked:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
        return capture_image;
    }

Method that gets a UIImage from the sample buffer:

    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);

        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);

        // Draw the BGRA pixel data into a bitmap context and snapshot it.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                     colorSpace,
                                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);

        UIImage *image = [UIImage imageWithCGImage:quartzImage];
        CGImageRelease(quartzImage);
        return image;
    }
2 answers

Take a look at Apple’s AVCam Demo app for a complete example.

The method

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection

is called every time a camera frame is ready, and in your case it is called 15 times per second, or at least it should be, because you specified the frame rate with output.minFrameDuration = CMTimeMake(1, 15);
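For what it's worth, output.minFrameDuration was deprecated after iOS 5; on iOS 7 and later the frame rate is capped on the AVCaptureDevice instead. A minimal sketch, assuming the device's active format supports 15 fps:

    NSError *configError = nil;
    if ([device lockForConfiguration:&configError]) {
        // Replaces output.minFrameDuration: 1/15 s per frame caps capture at 15 fps.
        device.activeVideoMinFrameDuration = CMTimeMake(1, 15);
        device.activeVideoMaxFrameDuration = CMTimeMake(1, 15);
        [device unlockForConfiguration];
    }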

From the code you posted, the only reason I can think of is that you are never releasing the UIImage *capture_image that gets created for every single frame.
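One thing worth trying (a sketch, not something from the original post) is to drain an autorelease pool for each frame, so that under ARC the images created on the background queue do not pile up between pool drains:

    // Sketch: release each frame's UIImage before the next frame arrives.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        @autoreleasepool {
            UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
            // ... use the image, e.g. hand it off to the main queue ...
        }
    }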

You can profile your application with Xcode's Instruments to see why this happens: Instruments User Guide

In your case the Leaks instrument is your first stop. There are plenty of tutorials for it online; here is one of them: Tracking iPhone Memory Leaks, written, if I'm not mistaken, by SO user OwenGross.


This post is pretty old, but in case someone comes across it:

Where are you returning the image to in this delegate method?

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
        return capture_image;
    }

The method is declared void, so that return statement hands the image to no one.

You can have the button raise a flag. In the delegate method, check whether the flag has been raised, and only then create the image (see the sketch below). The image must be stored in an instance variable, otherwise it will be lost anyway.
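A minimal sketch of that flag approach; takePicture:, _wantsPhoto, and _lastCapturedImage are illustrative names, not from the original code:

    // Only build a UIImage for the frame the user actually asked for.
    - (IBAction)takePicture:(id)sender {
        _wantsPhoto = YES; // raised by the button, consumed by the delegate
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        if (!_wantsPhoto) return; // the preview layer keeps running regardless
        _wantsPhoto = NO;
        _lastCapturedImage = [self imageFromSampleBuffer:sampleBuffer]; // keep it in an ivar
    }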

There is also a dedicated API for capturing still images: AVCaptureStillImageOutput's captureStillImageAsynchronouslyFromConnection:completionHandler:
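A sketch of that route, assuming a stillOutput has been added to the same session:

    // Sketch: dedicated still capture instead of grabbing a video frame.
    AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
    [session addOutput:stillOutput];

    AVCaptureConnection *connection =
        [stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [stillOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (!imageDataSampleBuffer) return;
            NSData *jpegData = [AVCaptureStillImageOutput
                jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            UIImage *photo = [UIImage imageWithData:jpegData];
            // ... store or display the photo ...
        }];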


Source: https://habr.com/ru/post/1382636/

