AVCaptureSession for Swift Image Capture

I created an AVCaptureSession to capture video output and display it to the user through a UIView. Now I want to be able to tap a button (the takePhoto method) and display a still image from the session in a UIImageView. I tried iterating through each connection to the device and trying to save the output, but that didn't work. The code I have is below:

    let captureSession = AVCaptureSession()
    var stillImageOutput: AVCaptureStillImageOutput!
    @IBOutlet var imageView: UIImageView!
    @IBOutlet var cameraView: UIView!
    // If we find a device we'll store it here for later use
    var captureDevice: AVCaptureDevice?

    override func viewDidLoad() {
        // Do any additional setup after loading the view, typically from a nib.
        super.viewDidLoad()
        println("I AM AT THE CAMERA")
        captureSession.sessionPreset = AVCaptureSessionPresetLow
        self.captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        if (captureDevice != nil) {
            beginSession()
        }
    }

    func beginSession() {
        self.stillImageOutput = AVCaptureStillImageOutput()
        self.captureSession.addOutput(self.stillImageOutput)
        var err: NSError? = nil
        self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
        if err != nil {
            println("error: \(err?.localizedDescription)")
        }
        var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
        self.cameraView.layer.addSublayer(previewLayer)
        previewLayer?.frame = self.cameraView.layer.frame
        captureSession.startRunning()
    }

    @IBAction func takePhoto(sender: UIButton) {
        self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo)) { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
            var image = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
            var data_image = UIImage(data: image)
            self.imageView.image = data_image
        }
    }
}
1 answer

You should try dispatching to a background thread when adding the input and output to the session, before starting it. The Apple documentation states:

Important: The startRunning method is a blocking call that can take some time, so you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive). See AVCam for iOS for an example of a canonical implementation.
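Following that guidance, the canonical pattern (as in the AVCam sample) is to create one dedicated serial queue and funnel all session work through it. A minimal sketch using the same Swift 1.x GCD API as this post; the queue label is illustrative, not from the original:

```swift
// A dedicated serial queue for all session configuration and start/stop work.
// The label "com.example.camera.session" is a placeholder.
let sessionQueue = dispatch_queue_create("com.example.camera.session", DISPATCH_QUEUE_SERIAL)

dispatch_async(sessionQueue, {
    // addInput/addOutput calls and startRunning all happen here,
    // so the blocking startRunning call never stalls the main queue.
    self.captureSession.startRunning()
})
```

Because the queue is serial, configuration changes and start/stop calls issued from anywhere in the app execute in order without explicit locking.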

Try using dispatch_async in your session setup method, like below:

    var err: NSError? = nil
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0), {
        // 1: configure the session off the main queue
        self.captureSession.addOutput(self.stillImageOutput)
        self.captureSession.addInput(AVCaptureDeviceInput(device: self.captureDevice, error: &err))
        self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
        if err != nil {
            println("error: \(err?.localizedDescription)")
        }
        var previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
        previewLayer?.frame = self.cameraView.layer.bounds
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspectFill
        dispatch_async(dispatch_get_main_queue(), {
            // 2: layer work belongs on the main queue
            // 3: start the session once the preview layer is attached
            self.cameraView.layer.addSublayer(previewLayer)
            self.captureSession.startRunning()
        })
    })
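When applying several session changes at once, you can also batch them between beginConfiguration() and commitConfiguration(), so the session applies them as a single atomic update. A hedged sketch reusing the property names from the code above:

```swift
self.captureSession.beginConfiguration()
// Changes made here are deferred and take effect together at commitConfiguration().
self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto
self.captureSession.addOutput(self.stillImageOutput)
self.captureSession.commitConfiguration()
```

This avoids the session briefly running in a half-configured state between individual addInput/addOutput calls.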

Source: https://habr.com/ru/post/986190/
