Get RGB pixel buffer from ARKit

I am trying to get a CVPixelBuffer in RGB color space from Apple ARKit. In the func session(_ session: ARSession, didUpdate frame: ARFrame) method of ARSessionDelegate I get an instance of ARFrame. On Apple's Displaying an AR Experience with Metal page, I found that this pixel buffer is in the YCbCr (YUV) color space.

I need to convert it to an RGB color space (I really need a CVPixelBuffer, not a UIImage). I found something about color conversion on iOS, but I couldn't get it to work in Swift 3.
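For context, here is a minimal sketch of the delegate callback where that buffer arrives; the class name and the print are just illustrative, but capturedImage is the ARKit property in question:

import ARKit

final class SessionHandler: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedImage is a CVPixelBuffer in bi-planar YCbCr
        // (typically kCVPixelFormatType_420YpCbCr8BiPlanarFullRange), not RGB.
        let pixelBuffer: CVPixelBuffer = frame.capturedImage
        print(CVPixelBufferGetPixelFormatType(pixelBuffer))
    }
}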

+5

3 answers

There are several ways to do this, depending on what you need. The best way to do it in real time (say, to render the buffer to a view) is to use a custom shader that converts the YCbCr CVPixelBuffer to RGB.

Using Metal: If you create a new project with the "Augmented Reality App" template and choose "Metal" as the content technology, the generated project contains the code and shaders needed for this conversion; a Swift sketch of the texture setup follows after this list.

Using OpenGL: Apple's GLCameraRipple example uses an AVCaptureSession to capture the camera feed and shows how to map the resulting CVPixelBuffer to GL textures, which are then converted to RGB in shaders (again, provided in the example).

Not in real time: The answer at fooobar.com/questions/234968 / ... covers converting the buffer to a UIImage and offers a fairly simple way to do that.
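For the Metal route, the Swift side of the template work amounts to wrapping each plane of the captured buffer in a Metal texture through a CVMetalTextureCache; the actual YCbCr-to-RGB math then runs in the fragment shader. A rough sketch of that plane-to-texture step, assuming you already have an MTLDevice (class and method names are mine, not the template's):

import Metal
import CoreVideo

final class CapturedImageTextures {
    private var textureCache: CVMetalTextureCache?

    init(device: MTLDevice) {
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
    }

    // Wraps one plane of the YCbCr pixel buffer in an MTLTexture without copying.
    func texture(from pixelBuffer: CVPixelBuffer,
                 pixelFormat: MTLPixelFormat,
                 planeIndex: Int) -> MTLTexture? {
        guard let cache = textureCache else { return nil }
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex)
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, pixelBuffer, nil,
                                                  pixelFormat, width, height, planeIndex, &cvTexture)
        return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
    }

    // Plane 0 is luma (Y, one channel), plane 1 is chroma (CbCr, two channels).
    func lumaAndChroma(from pixelBuffer: CVPixelBuffer) -> (MTLTexture, MTLTexture)? {
        guard let luma = texture(from: pixelBuffer, pixelFormat: .r8Unorm, planeIndex: 0),
              let chroma = texture(from: pixelBuffer, pixelFormat: .rg8Unorm, planeIndex: 1)
        else { return nil }
        return (luma, chroma)
    }
}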

+4

The docs explicitly say that you need to access the luminance and chrominance planes:

ARKit captures pixel buffers in a planar YCbCr format (also known as YUV). To render these images on the device display, you need to access the luma and chroma planes of the pixel buffer and convert the pixel values to an RGB format.

So there is no way to get RGB planes directly; you will have to handle the conversion yourself in your shaders, whether in Metal or in OpenGL, as described by @joshue.
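For reference, the per-pixel math those shaders apply is a standard full-range BT.601 conversion. Below is a CPU-side Swift sketch of the same formula, assuming full-range coefficients; it reads only a single pixel and is meant to make the math concrete, not to replace the shader approach (function names are mine):

import CoreVideo

// Standard full-range BT.601 YCbCr -> RGB formula (the same math the shaders
// apply per pixel). y, cb, cr are normalized to 0...1.
func rgb(y: Float, cb: Float, cr: Float) -> (r: Float, g: Float, b: Float) {
    let cb = cb - 0.5, cr = cr - 0.5
    return (y + 1.402 * cr,
            y - 0.3441 * cb - 0.7141 * cr,
            y + 1.772 * cb)
}

// Reading the two planes on the CPU (illustration only; doing this every
// frame is far slower than the shader route described above).
func sampleTopLeftPixel(of pixelBuffer: CVPixelBuffer) -> (r: Float, g: Float, b: Float)? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }
    guard let lumaBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0),
          let chromaBase = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1) else { return nil }
    let y = Float(lumaBase.assumingMemoryBound(to: UInt8.self)[0]) / 255
    let chroma = chromaBase.assumingMemoryBound(to: UInt8.self)   // interleaved Cb, Cr
    let cb = Float(chroma[0]) / 255
    let cr = Float(chroma[1]) / 255
    return rgb(y: y, cb: cb, cr: cr)
}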

+2

I was also stuck on this issue for several days. Nearly every code snippet I could find on the internet for converting a CVPixelBuffer to a UIImage is written in Objective-C rather than Swift.

Finally, the following code snippet works great for me: it converts the YUV pixel buffer to a UIImage, which you can then save as a JPEG or PNG file in your application.

func pixelBufferToUIImage(pixelBuffer: CVPixelBuffer) -> UIImage {
    // Core Image handles the YCbCr -> RGB conversion when creating the CGImage.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext(options: nil)
    let cgImage = context.createCGImage(ciImage, from: ciImage.extent)
    let uiImage = UIImage(cgImage: cgImage!)
    return uiImage
}
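If, as in the original question, you need an RGB CVPixelBuffer rather than a UIImage, the same Core Image approach can render straight into a BGRA buffer you allocate yourself. A sketch under those assumptions (the function name and the choice of 32BGRA are mine; reuse the CIContext rather than creating one per frame):

import CoreImage
import CoreVideo

// Renders the YCbCr capture into a newly allocated 32BGRA pixel buffer.
func makeBGRAPixelBuffer(from yuvPixelBuffer: CVPixelBuffer, using context: CIContext) -> CVPixelBuffer? {
    let width = CVPixelBufferGetWidth(yuvPixelBuffer)
    let height = CVPixelBufferGetHeight(yuvPixelBuffer)

    var rgbBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_32BGRA, nil, &rgbBuffer)
    guard status == kCVReturnSuccess, let output = rgbBuffer else { return nil }

    // Core Image performs the YCbCr -> RGB conversion while rendering.
    context.render(CIImage(cvPixelBuffer: yuvPixelBuffer), to: output)
    return output
}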
+1

Source: https://habr.com/ru/post/1268637/

