I have just about solved it, thanks to some brilliant help that put me on the right track. This is the code I have now.
Basically, I can now make one image from the drawn overlay and another from the camera, but I still cannot combine them. There seems to be very little useful code out there that makes this simple. (A possible way to do the merge is sketched after the code below.)
The important parts are the extension block right at the top and the additions to func saveToCamera() at the bottom of the code. In short, I think I now have the two images I need. The capture of myImage appears on a white background, so I'm not sure whether that is normal or not; it looks the same in the simulator, so it may just be expected.
Image 1. Screen capture. 
Image 2. Saved image of myImage as explained. 
import UIKit
import AVFoundation
import Foundation

// extension must be outside the class
extension UIImage {
    convenience init(view: UIView) {
        UIGraphicsBeginImageContext(view.frame.size)
        view.layer.render(in: UIGraphicsGetCurrentContext()!)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        self.init(cgImage: (image?.cgImage)!)
    }
}

class ViewController: UIViewController {

    @IBOutlet weak var navigationBar: UINavigationBar!
    @IBOutlet weak var imgOverlay: UIImageView!
    @IBOutlet weak var btnCapture: UIButton!
    @IBOutlet weak var shapeLayer: UIView!

    let captureSession = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()
    var previewLayer: AVCaptureVideoPreviewLayer?
    //var shapeLayer : CALayer?

    // If we find a device we'll store it here for later use
    var captureDevice: AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        //=======================
        let midX = self.view.bounds.midX
        let midY = self.view.bounds.midY

        for index in 1...10 {
            let circlePath = UIBezierPath(arcCenter: CGPoint(x: midX, y: midY),
                                          radius: CGFloat(index * 10),
                                          startAngle: CGFloat(0),
                                          endAngle: CGFloat(M_PI * 2),
                                          clockwise: true)

            let shapeLayerPath = CAShapeLayer()
            shapeLayerPath.path = circlePath.cgPath

            // change the fill color
            shapeLayerPath.fillColor = UIColor.clear.cgColor
            // you can change the stroke color
            shapeLayerPath.strokeColor = UIColor.blue.cgColor
            // you can change the line width
            shapeLayerPath.lineWidth = 0.5

            // add the blue-circle layer to the shapeLayer ImageView
            shapeLayer.layer.addSublayer(shapeLayerPath)
        }
        print("Shape layer drawn")
        //=====================

        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        if let devices = AVCaptureDevice.devices() as? [AVCaptureDevice] {
            // Loop through all the capture devices on this phone
            for device in devices {
                // Make sure this particular device supports video
                if device.hasMediaType(AVMediaTypeVideo) {
                    // Finally check the position and confirm we've got the back camera
                    if device.position == AVCaptureDevicePosition.back {
                        captureDevice = device
                        if captureDevice != nil {
                            print("Capture device found")
                            beginSession()
                        }
                    }
                }
            }
        }
    }

    @IBAction func actionCameraCapture(_ sender: AnyObject) {
        print("Camera button pressed")
        saveToCamera()
    }

    func beginSession() {
        do {
            try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
            stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
            if captureSession.canAddOutput(stillImageOutput) {
                captureSession.addOutput(stillImageOutput)
            }
        } catch {
            print("error: \(error.localizedDescription)")
        }

        guard let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) else {
            print("no preview layer")
            return
        }

        // this is what displays the camera view. But it is on TOP of the drawn view, and under the overlay. ??
        self.view.layer.addSublayer(previewLayer)
        previewLayer.frame = self.view.layer.frame
        captureSession.startRunning()
        print("Capture session running")

        self.view.addSubview(navigationBar)
        //self.view.addSubview(imgOverlay)
        self.view.addSubview(btnCapture)

        // the shapeLayer ImageView is already a subview created in IB,
        // but this brings it to the front
        self.view.addSubview(shapeLayer)
    }

    func saveToCamera() {
        if let videoConnection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo) {
            stillImageOutput.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (CMSampleBuffer, Error) in
                if let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(CMSampleBuffer) {
                    if let cameraImage = UIImage(data: imageData) {

                        // cameraImage is the captured camera image.
                        // I need to combine/merge it with myImage, which is the blue circles.

                        // This converts the UIView holding the blue circles to an image.
                        // Uses the 'extension' at the top of the code.
                        let myImage = UIImage(view: self.shapeLayer)
                        print("converting myImage to an image")

                        UIImageWriteToSavedPhotosAlbum(cameraImage, nil, nil, nil)
                    }
                }
            })
        }
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
}
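For the merge step that is still missing, something along these lines might work. This is only an untested sketch, not code from the project above, and composite(cameraImage:overlayImage:) is just an illustrative helper name: it draws the camera photo into a bitmap context of the photo's own size and then draws the overlay image on top of it.

func composite(cameraImage: UIImage, overlayImage: UIImage) -> UIImage? {
    let size = cameraImage.size
    // false = keep an alpha channel, 0 = use the device's screen scale
    UIGraphicsBeginImageContextWithOptions(size, false, 0)
    defer { UIGraphicsEndImageContext() }

    // Draw the camera photo first, then stretch the overlay over the same rect.
    cameraImage.draw(in: CGRect(origin: .zero, size: size))
    overlayImage.draw(in: CGRect(origin: .zero, size: size))

    return UIGraphicsGetImageFromCurrentImageContext()
}

Inside the completion handler it could then be used instead of saving cameraImage directly:

if let combined = composite(cameraImage: cameraImage, overlayImage: myImage) {
    UIImageWriteToSavedPhotosAlbum(combined, nil, nil, nil)
}

Two caveats: the still image from AVCaptureStillImageOutput is usually a different size and aspect ratio from the screen, so the circles will not necessarily land exactly where they appear in the preview without extra scaling or cropping, and the overlay view's background needs to be clear (not opaque white) or it will cover the photo when drawn on top.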