Taking a photo with a custom camera in Swift 3

In Swift 2.3 I used this code to take a picture with a custom camera:

func didPressTakePhoto() {
    if let videoConnection = stillImageOutput!.connection(withMediaType: AVMediaTypeVideo) {
        stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
            if sampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProviderCreateWithCFData(imageData)
                let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
                self.captureImageView.image = image
            }
        })
    }
}

But this line:

stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in

Shows this error:

Value of type 'AVCapturePhotoOutput' has no member 'captureStillImageAsynchronouslyFromConnection'

I tried to fix it myself, but I kept getting more and more errors, so I am posting my original code.

Does anyone know how to make my code work again?

Thanks.

+5
3 answers

Thanks to Sharpkits, I found my solution (this code works for me):

func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?) {

    if let error = error {
        print(error.localizedDescription)
    }

    if let sampleBuffer = photoSampleBuffer,
       let previewBuffer = previewPhotoSampleBuffer,
       let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {

        let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: nil)
        let dataProvider = CGDataProvider(data: imageData as! CFData)
        let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: CGColorRenderingIntent.absoluteColorimetric)
        let image = UIImage(cgImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.right)

        let cropedImage = self.cropToSquare(image: image)
        let newImage = self.scaleImageWith(cropedImage, and: CGSize(width: 600, height: 600))
        print(UIScreen.main.bounds.width)

        self.tempImageView.image = newImage
        self.tempImageView.isHidden = false
    } else {

    }
}
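
For context, here is a minimal sketch of the capture-session wiring that a delegate method like the one above assumes. The class and property names (CameraViewController, captureSession, photoOutput) are illustrative, not taken from the original post:

import UIKit
import AVFoundation

// Sketch only: a view controller that owns the session and the photo output.
// The capture(_:didFinishProcessingPhotoSampleBuffer:...) method from the answer
// above would live in this class (it conforms to AVCapturePhotoCaptureDelegate).
class CameraViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    let captureSession = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto

        // Hook up the camera as input and the photo output as output.
        guard let camera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
              let input = try? AVCaptureDeviceInput(device: camera),
              captureSession.canAddInput(input),
              captureSession.canAddOutput(photoOutput) else { return }

        captureSession.addInput(input)
        captureSession.addOutput(photoOutput)
        captureSession.startRunning()
    }

    func didPressTakePhoto() {
        // Trigger the capture; the delegate method is called when the photo is ready.
        let settings = AVCapturePhotoSettings()
        photoOutput.capturePhoto(with: settings, delegate: self)
    }
}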
+4

You can use AVCapturePhotoOutput like this in Swift 3:

You need an AVCapturePhotoCaptureDelegate, which returns a CMSampleBuffer.

You can also get a preview image if you set the previewPhotoFormat on the AVCapturePhotoSettings:

class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

    let cameraOutput = AVCapturePhotoOutput()

    func capturePhoto() {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                             kCVPixelBufferWidthKey as String: 160,
                             kCVPixelBufferHeightKey as String: 160]
        settings.previewPhotoFormat = previewFormat
        self.cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: NSError?) {

        if let error = error {
            print(error.localizedDescription)
        }

        if let sampleBuffer = photoSampleBuffer,
           let previewBuffer = previewPhotoSampleBuffer,
           let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print(UIImage(data: dataImage)?.size as Any)
        } else {

        }
    }
}
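
If you also want to use the preview buffer that the delegate receives, one possible approach (my own assumption, not part of the answer above) is to turn it into a UIImage via Core Image:

import UIKit
import CoreImage
import AVFoundation

// Hypothetical helper, not from the answer: converts the previewPhotoSampleBuffer
// delivered to the delegate into a UIImage suitable for a thumbnail view.
func previewImage(from sampleBuffer: CMSampleBuffer?) -> UIImage? {
    guard let buffer = sampleBuffer,
          let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    return UIImage(ciImage: ciImage)
}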
+7

Great code. Thanks for the help and examples.

Just to clarify, for the slower among us like me: the capture(_:didFinishProcessingPhotoSampleBuffer:...) method is called behind the scenes when you call self.cameraOutput.capturePhoto(with: settings, delegate: self) inside your takePhoto method (or whatever you call it). You never call the capture method directly; it happens automatically.
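
In other words (a sketch only, with an assumed view controller wrapped around the CameraCaptureOutput class from the answer above):

import UIKit

// Sketch: tapping the button only requests a capture. AVFoundation then calls the
// capture(_:didFinishProcessingPhotoSampleBuffer:...) delegate method for you;
// you never call it yourself.
class PhotoViewController: UIViewController {

    let cameraCapture = CameraCaptureOutput()

    @IBAction func takePhotoTapped(_ sender: UIButton) {
        cameraCapture.capturePhoto()
    }
}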

+2

Source: https://habr.com/ru/post/1258253/
