OpenGL layer content on top of AVCaptureSession CVImageBufferRef from camera

I have two working pieces that I want to combine now. First, I can successfully composite a CATextLayer onto the camera frame and save the result with AVAssetWriter via an AVAssetWriterInputPixelBufferAdaptor. I do it like this:

    - (void)processNewBuffer:(CVImageBufferRef)cameraFrame
    {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        // Update the CALayer on the main queue -- UIKit is not thread safe.
        dispatch_sync(dispatch_get_main_queue(), ^{
            [self updateCALayer];
        });

        if (recorder.recording) {
            CVPixelBufferLockBaseAddress(cameraFrame, 0);

            // Do stuff with the buffer here.
            uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(cameraFrame);
            size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);
            width  = CVPixelBufferGetWidth(cameraFrame);
            height = CVPixelBufferGetHeight(cameraFrame);

            // Wrap the camera buffer in a CGBitmapContext and render the
            // text layer directly into it.
            CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
            CGContextRef newContext = CGBitmapContextCreate(baseAddress,
                width, height, 8, bytesPerRow, colorSpace,
                kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
            [textLayer renderInContext:newContext];

            [recorder appendPixelBuffer:cameraFrame
                   withPresentationTime:camera.lastSampleTime];

            CVPixelBufferUnlockBaseAddress(cameraFrame, 0);

            // Release the Core Graphics objects.
            CGContextRelease(newContext);
            CGColorSpaceRelease(colorSpace);
        }

        [pool drain];
    }

It works like a charm. Now for my second trick. Thanks to the answer to this question:

OpenGL CMSampleBuffer for outputting video using AVAssetWriter

I was able to modify the OpenGL teapot sample from the WWDC 2010 sample code and save the rendered content to a movie file on the iPhone.
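
For reference, the core of that approach is reading the rendered GL frame back into a CVPixelBufferRef and appending it through the adaptor. A rough sketch, assuming an `adaptor` with a configured pixel buffer pool and a `frameTime` for the presentation timestamp (those names are mine, not from the sample):

    // Sketch: append the current GL framebuffer (width x height) to the writer.
    // GL_BGRA_EXT needs the EXT_read_format_bgra extension, which iOS devices
    // support; with plain GL_RGBA the red/blue channels come out swapped
    // relative to a kCVPixelFormatType_32BGRA buffer.
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                       adaptor.pixelBufferPool, &pixelBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Assumes bytesPerRow == width * 4; if the buffer has row padding,
    // read into a scratch buffer and copy row by row instead.
    glReadPixels(0, 0, width, height, GL_BGRA_EXT, GL_UNSIGNED_BYTE,
                 CVPixelBufferGetBaseAddress(pixelBuffer));

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
    CVPixelBufferRelease(pixelBuffer);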

NOW, I want to put the teapot in one corner of the camera frame and save the combined result to the movie. The problem I've run into is basic C: how do I copy from one buffer into another when one buffer is 1280x720 (the camera frame) and the teapot lives in a 320x320 buffer? Another consideration is speed. To process 30 frames per second, I can't take a detour through the CGImageRef or UIImage classes either; this has to happen as fast as possible. What is the best way to do this?
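
In plain C this comes down to a row-by-row memcpy at an offset. A minimal sketch, assuming both buffers are 32-bit BGRA; the helper name and the (dstX, dstY) corner parameters are hypothetical, not from any Apple API:

    #include <string.h>
    #import <CoreVideo/CoreVideo.h>

    // Sketch: copy a small 32-bit BGRA pixel buffer into a rectangle of a
    // larger one. Caller must ensure the rectangle fits, i.e.
    // dstX + srcWidth <= dstWidth and dstY + srcHeight <= dstHeight.
    static void BlitPixelBuffer(CVPixelBufferRef src, CVPixelBufferRef dst,
                                size_t dstX, size_t dstY)
    {
        CVPixelBufferLockBaseAddress(src, 0);
        CVPixelBufferLockBaseAddress(dst, 0);

        size_t srcWidth  = CVPixelBufferGetWidth(src);   // e.g. 320
        size_t srcHeight = CVPixelBufferGetHeight(src);  // e.g. 320
        size_t srcStride = CVPixelBufferGetBytesPerRow(src);
        size_t dstStride = CVPixelBufferGetBytesPerRow(dst);

        uint8_t *srcBase = (uint8_t *)CVPixelBufferGetBaseAddress(src);
        // Start at the destination corner: dstY rows down, dstX pixels
        // (4 bytes each) across.
        uint8_t *dstBase = (uint8_t *)CVPixelBufferGetBaseAddress(dst)
                           + dstY * dstStride + dstX * 4;

        // One memcpy per row; strides can be wider than width * 4 due to
        // padding, so the whole region cannot be copied in a single call.
        for (size_t row = 0; row < srcHeight; row++) {
            memcpy(dstBase + row * dstStride,
                   srcBase + row * srcStride,
                   srcWidth * 4);
        }

        CVPixelBufferUnlockBaseAddress(dst, 0);
        CVPixelBufferUnlockBaseAddress(src, 0);
    }

A plain memcpy overwrites the camera pixels in that rectangle; if the teapot needs transparency, you would alpha-blend the rows instead (Accelerate's vImage has premultiplied-alpha blend functions for this), which is still far cheaper than a round trip through CGImageRef or UIImage.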

1 answer

You can try the SpriteKit framework (available since iOS 7), which runs on top of OpenGL and should sustain a high frame rate when working with images/textures.

Start with Apple's documentation: https://developer.apple.com/library/ios/documentation/GraphicsAnimation/Conceptual/SpriteKit_PG/Introduction/Introduction.html#//apple_ref/doc/uid/TP40013043-CH1-SW1
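
As a rough illustration of the idea (a sketch only; the node setup and the `cameraImage`/`teapotImage` CGImageRefs are hypothetical, not from the question's code): the camera frame and the teapot render each become a texture on an SKSpriteNode, and SpriteKit composites them on the GPU:

    // Sketch: composite a 320x320 overlay onto a 1280x720 frame in SpriteKit.
    SKScene *scene = [SKScene sceneWithSize:CGSizeMake(1280, 720)];

    // Full-screen camera frame as the background sprite.
    SKTexture *frameTex = [SKTexture textureWithCGImage:cameraImage];
    SKSpriteNode *frameNode = [SKSpriteNode spriteNodeWithTexture:frameTex];
    frameNode.size = scene.size;
    frameNode.position = CGPointMake(640, 360);   // centered, fills the scene
    [scene addChild:frameNode];

    // Teapot render pinned to the top-left corner (SpriteKit's origin is
    // bottom-left, so a high y value means near the top).
    SKTexture *teapotTex = [SKTexture textureWithCGImage:teapotImage];
    SKSpriteNode *teapotNode = [SKSpriteNode spriteNodeWithTexture:teapotTex];
    teapotNode.size = CGSizeMake(320, 320);
    teapotNode.position = CGPointMake(160, 560);
    [scene addChild:teapotNode];

Note that SpriteKit only solves the compositing side; to feed AVAssetWriter you would still need to read the rendered frames back out, e.g. with SKView's textureFromNode: (iOS 7+) or the glReadPixels route described in the question.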

Hope this helps
