How to convert Objective-C AVFoundation code to Swift?

I use AVFoundation in Swift to capture photos, but I cannot convert some lines of this Objective-C code to Swift. Here is my method:

    - (void)capImage { // method to capture image from AVCaptureSession video feed
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in stillImageOutput.connections) {
            for (AVCaptureInputPort *port in [connection inputPorts]) {
                if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) {
                break;
            }
        }

        NSLog(@"about to request a capture from: %@", stillImageOutput);
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                      completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
            if (imageSampleBuffer != NULL) {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                [self processImage:[UIImage imageWithData:imageData]];
            }
        }];
    }

This line gives me an error ("'AnyObject[]' does not conform to the Sequence protocol"):

  for (AVCaptureInputPort *port in [connection inputPorts]) { 

In Swift:

  for port:AnyObject in connection.inputPorts { 

And I do not know how to convert this line:

  [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) { 

Can anyone help me convert this to Swift? Thanks!

3 answers
For this line:

    for (AVCaptureInputPort *port in [connection inputPorts]) {

AnyObject arrays must be cast to arrays of your actual type before iterating over them, for example:

    for port in connection.inputPorts as! [AVCaptureInputPort] { }
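For instance, a sketch of the question's connection-finding loop in Swift using that cast (assuming stillImageOutput is the same AVCaptureStillImageOutput property as in the question):

    var videoConnection: AVCaptureConnection?
    outer: for connection in stillImageOutput.connections as! [AVCaptureConnection] {
        for port in connection.inputPorts as! [AVCaptureInputPort] {
            if port.mediaType == AVMediaTypeVideo {
                videoConnection = connection
                // a labelled break replaces the nested break / if (videoConnection) break pattern
                break outer
            }
        }
    }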

As for converting blocks to closures, you just need to get the syntax right.

    stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageSampleBuffer, error) in
        // this line names the closure's parameters
        // ...
    }

Note that this also uses trailing closure syntax. See the documentation to learn more!
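For comparison, a sketch of the same call without trailing closure syntax, passing the closure explicitly as the completionHandler: argument:

    stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (imageSampleBuffer, error) in
        // same body as above
    })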

EDIT: As for the initializers, they now look like this:

    let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
    if let image = UIImage(data: imageData) {
        self.processImage(image)
    }
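Putting the cast, the closure, and the initializers together with the videoConnection found by the loop above, a sketch of the complete capture call might look like this (assuming processImage(_:) takes a UIImage, as in the question):

    if let videoConnection = videoConnection {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageSampleBuffer, error) in
            if imageSampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
                if let image = UIImage(data: imageData) {
                    self.processImage(image)
                }
            }
        }
    }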

Try this:

    if let videoConnection = self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
            // the EXIF attachment is only used here as a check that a frame was actually captured
            if let exifAttachments = CMGetAttachment(buffer, kCGImagePropertyExifDictionary, nil) {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
                if let image = UIImage(data: imageData) {
                    // show the capture and save it to the photo library
                    self.previewImage.image = image
                    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
                }
            }
        })
    }
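This snippet assumes the session and stillImageOutput are already configured. A sketch of one possible setup, e.g. in viewDidLoad (the property names here are placeholders, not part of the original answer, and the device-input initializer may differ by Swift version):

    import AVFoundation

    let captureSession = AVCaptureSession()
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto

    // hook up the default camera as input
    let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    if let input = try? AVCaptureDeviceInput(device: device) {
        if captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }
    }

    // configure a JPEG still image output
    let stillImageOutput = AVCaptureStillImageOutput()
    stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
    if captureSession.canAddOutput(stillImageOutput) {
        captureSession.addOutput(stillImageOutput)
    }

    captureSession.startRunning()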

This should solve the ports problem:

    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        // take a photo here
    }
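Inside that if, you could then use the same capture call as in the first answer, roughly (a sketch, assuming stillImageOutput is your AVCaptureStillImageOutput):

    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) { (imageSampleBuffer, error) in
            if imageSampleBuffer != nil {
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
                // hand the image off however your app needs, e.g. UIImage(data: imageData)
            }
        }
    }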