Getting a screenshot through CVOpenGLESTextureCacheCreate

I have an application similar to the GLPaint example (but using OpenGL ES 2.0). At certain points I want to take screenshots of the picture. I have already read this topic,

but I don't understand at what point I should call CVOpenGLESTextureCacheCreate and what else needs to be done. Can anyone help me?

+1
3 answers

The code I describe in the answer you link to creates a pixel buffer, extracts its matching texture, and binds that texture as the color attachment of a framebuffer object. You run this code once, to set up the FBO you will render your scene into.
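For reference, that one-time setup looks roughly like the following. This is a sketch based on that answer; textureCache, renderTarget, renderTexture, and currentFBOSize are illustrative names, and context is your EAGLContext:

```objc
// One-time setup: texture cache -> IOSurface-backed pixel buffer ->
// cache texture -> bind as the FBO's color attachment.
CVOpenGLESTextureCacheRef textureCache;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, context, NULL, &textureCache);

// The pixel buffer must be IOSurface-backed for the texture cache to work.
CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0,
                                           &kCFTypeDictionaryKeyCallBacks,
                                           &kCFTypeDictionaryValueCallBacks);
CFMutableDictionaryRef attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
                                                         &kCFTypeDictionaryKeyCallBacks,
                                                         &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

CVPixelBufferRef renderTarget;
CVPixelBufferCreate(kCFAllocatorDefault,
                    (int)currentFBOSize.width, (int)currentFBOSize.height,
                    kCVPixelFormatType_32BGRA, attrs, &renderTarget);

CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
                                             renderTarget, NULL,
                                             GL_TEXTURE_2D, GL_RGBA,
                                             (int)currentFBOSize.width,
                                             (int)currentFBOSize.height,
                                             GL_BGRA, GL_UNSIGNED_BYTE, 0,
                                             &renderTexture);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture),
              CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// With your FBO bound, attach the cache texture; everything rendered
// into this FBO then lands directly in renderTarget's memory.
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(renderTexture), 0);
```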

Whenever you want to capture from this texture, you will probably want to use glFinish() to block until all OpenGL ES rendering has completed, and then use the code I describe there:

    CVPixelBufferLockBaseAddress(renderTarget, 0);
    _rawBytesForImage = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
    // Do something with the bytes
    CVPixelBufferUnlockBaseAddress(renderTarget, 0);

to extract raw bytes for a texture containing an image of your scene.

The internal byte ordering of iOS textures is BGRA, so you will want to use something like the following to create a CGImageRef from those bytes:

    // It appears that the width of a texture must be padded out to be a multiple
    // of 8 (32 bytes) if reading from it using a texture cache
    NSUInteger paddedWidthOfImage = CVPixelBufferGetBytesPerRow(renderTarget) / 4.0;
    NSUInteger paddedBytesForImage = paddedWidthOfImage * (int)currentFBOSize.height * 4;

    dataProvider = CGDataProviderCreateWithData((__bridge_retained void *)self,
                                                _rawBytesForImage,
                                                paddedBytesForImage,
                                                dataProviderUnlockCallback);
    cgImageFromBytes = CGImageCreate((int)currentFBOSize.width,
                                     (int)currentFBOSize.height,
                                     8,
                                     32,
                                     CVPixelBufferGetBytesPerRow(renderTarget),
                                     defaultRGBColorSpace,
                                     kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst,
                                     dataProvider,
                                     NULL,
                                     NO,
                                     kCGRenderingIntentDefault);

In the above example, I use a dataProviderUnlockCallback() function to handle unlocking the pixel buffer and safely resuming rendering, but in your case you can probably ignore that and just pass NULL for that parameter.
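If you do keep the callback, a minimal sketch might look like this (MyRenderer and its renderTarget property are hypothetical names standing in for your own class):

```objc
// Hypothetical callback matching CGDataProviderReleaseDataCallback.
// 'info' is the (__bridge_retained void *)self passed to
// CGDataProviderCreateWithData(), transferred back to ARC here.
void dataProviderUnlockCallback(void *info, const void *data, size_t size)
{
    MyRenderer *renderer = (__bridge_transfer MyRenderer *)info;
    CVPixelBufferUnlockBaseAddress(renderer.renderTarget, 0);
    // Safe to resume rendering into the texture-backed FBO now.
}
```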

+2

Another option:

  • Create a new CGBitmapContext
  • Call [layer renderInContext: yourNewContext]
  • Get CGImage from a bitmap context
  • Create a pixel buffer, draw the CGImage into it, and append it to an AVAssetWriter using an AVAssetWriterInputPixelBufferAdaptor.
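A sketch of the steps above, using the UIGraphics convenience wrappers around CGBitmapContext (layer is your CALayer; the asset-writer step is elided). One caveat: -renderInContext: rasterizes Core Animation content, not what OpenGL ES drew, which may be why this approach fails here:

```objc
// Sketch only: -renderInContext: captures the layer tree's CA content,
// not the OpenGL ES framebuffer behind it.
UIGraphicsBeginImageContextWithOptions(layer.bounds.size, NO, 0.0);
[layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CGImageRef cgImage = snapshot.CGImage;
// ...then draw cgImage into a CVPixelBuffer and append it with
// AVAssetWriterInputPixelBufferAdaptor's appendPixelBuffer:withPresentationTime:.
```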

Good luck

EDIT

This may not work: I am testing it now and will update as soon as possible.

0

Try VTCreateCGImageFromCVPixelBuffer from VideoToolbox (available on iOS 9.0+):

    OSStatus VTCreateCGImageFromCVPixelBuffer(CVPixelBufferRef pixelBuffer,
                                              CFDictionaryRef options,
                                              CGImageRef _Nullable *imageOut);

Pass the pixel buffer that will be the source of image data for your CGImage (options can be NULL).
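A minimal usage sketch (pixelBuffer stands for your CVPixelBufferRef render target; requires #import &lt;VideoToolbox/VideoToolbox.h&gt;):

```objc
CGImageRef image = NULL;
OSStatus status = VTCreateCGImageFromCVPixelBuffer(pixelBuffer, NULL, &image);
if (status == noErr && image != NULL) {
    UIImage *screenshot = [UIImage imageWithCGImage:image];
    CGImageRelease(image);
    // use 'screenshot'...
}
```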

0

Source: https://habr.com/ru/post/1497852/
