Since AVCaptureVideoPreviewLayer is implemented as an OpenGL-backed layer, you cannot snapshot it through the usual Core Graphics context. Instead, I suggest accessing the raw frame data.
Add AVCaptureVideoDataOutput with a delegate:
```swift
previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

let captureVideoOutput = AVCaptureVideoDataOutput()
captureVideoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main)
captureSession?.addOutput(captureVideoOutput)

previewLayer.frame = localView.bounds
```
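The conversion further down assumes the frames arrive as 32-bit BGRA, which the data output does not necessarily deliver by default. As a sketch, you can request that format explicitly when configuring the output:

```swift
// Ask the data output for 32-bit BGRA frames so the CGContext-based
// conversion in the UIImage extension can read the pixel buffer directly.
captureVideoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
```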
Make your controller (or whichever object you prefer) conform to AVCaptureVideoDataOutputSampleBufferDelegate.
Declare a shouldCaptureFrame variable and set it to true whenever you need to take a snapshot.
```swift
var shouldCaptureFrame = false
...
func takeSnapshot() {
    shouldCaptureFrame = true
}
```
Then implement captureOutput(_:didOutput:from:) from the delegate protocol:
```swift
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard shouldCaptureFrame else { return }
    shouldCaptureFrame = false

    if let image = UIImage.from(sampleBuffer: sampleBuffer) {
        // Use the snapshot here, e.g. hand it to the UI.
    }
}
```
Finally, an extension providing the from(sampleBuffer:) helper:
```swift
extension UIImage {
    static func from(sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return nil
        }

        CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }

        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
        let colorSpace = CGColorSpaceCreateDeviceRGB()

        // Assumes 32-bit BGRA frames: little-endian byte order, alpha first.
        // CGContext requires the alpha info alongside the byte order.
        let bitmapInfo = CGBitmapInfo.byteOrder32Little.rawValue
            | CGImageAlphaInfo.premultipliedFirst.rawValue

        let context = CGContext(
            data: baseAddress,
            width: CVPixelBufferGetWidth(imageBuffer),
            height: CVPixelBufferGetHeight(imageBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
            space: colorSpace,
            bitmapInfo: bitmapInfo
        )

        guard let quartzImage = context?.makeImage() else { return nil }
        return UIImage(cgImage: quartzImage)
    }
}
```