There are several ways to do this, depending on what you need. The best way to do it in real time (say, to render the buffer for display) is to use a custom shader to convert the YCbCr CVPixelBuffer to RGB.
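For reference, the per-pixel math the shader performs in both approaches below is the standard full-range BT.601 conversion (this assumes a kCVPixelFormatType_420YpCbCr8BiPlanarFullRange buffer; video-range buffers use slightly different coefficients):

```
R = Y + 1.402  * (Cr - 0.5)
G = Y - 0.3441 * (Cb - 0.5) - 0.7141 * (Cr - 0.5)
B = Y + 1.772  * (Cb - 0.5)
```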
Using Metal: If you create a new project from the "Augmented Reality" application template and select "Metal" as the content technology, the generated project contains the code and shaders needed for this conversion.
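If you don't want to start from the template, here is a minimal Swift sketch of the texture setup that approach uses, assuming the pixel buffer arrives from your own capture callback (the helper name `makeTexture(from:pixelFormat:planeIndex:)` is mine, not from the template):

```swift
import CoreVideo
import Metal

// Pull the luma and chroma planes of a biplanar YCbCr CVPixelBuffer
// into Metal textures via a CVMetalTextureCache.
let device = MTLCreateSystemDefaultDevice()!
var textureCache: CVMetalTextureCache?
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)

func makeTexture(from pixelBuffer: CVPixelBuffer,
                 pixelFormat: MTLPixelFormat,
                 planeIndex: Int) -> MTLTexture? {
    let width  = CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex)
    var cvTexture: CVMetalTexture?
    CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache!,
                                              pixelBuffer, nil, pixelFormat,
                                              width, height, planeIndex, &cvTexture)
    return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
}

// For each captured frame (`pixelBuffer` comes from your capture callback):
// plane 0 is Y (one channel), plane 1 is interleaved CbCr (two channels).
// A fragment shader then samples both textures and applies the math above.
// let luma   = makeTexture(from: pixelBuffer, pixelFormat: .r8Unorm,  planeIndex: 0)
// let chroma = makeTexture(from: pixelBuffer, pixelFormat: .rg8Unorm, planeIndex: 1)
```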
Using OpenGL: Apple's GLCameraRipple sample uses an AVCaptureSession to capture from the camera and shows how to map the resulting CVPixelBuffers to GL textures, which are then converted from YCbCr to RGB in shaders (again, provided in the sample).
Not in real time: The answer at fooobar.com/questions/234968 / ... converts the buffer to a UIImage and offers a fairly simple way to do this.
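For completeness, here is a minimal Swift sketch of that non-real-time path using Core Image (the helper name `uiImage(from:)` is mine, not from the linked answer):

```swift
import CoreImage
import UIKit

// Wrap the pixel buffer in a CIImage and render it out through a CIContext.
// Fine for one-off conversions; too slow to run per frame.
func uiImage(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()   // create once and reuse in real code
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```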