OpenGL Texture from CALayer (AVPlayerLayer)

I have an AVPlayerLayer from which I would like to create an OpenGL texture. I am comfortable working with OpenGL textures, and even with converting a CGImageRef to an OpenGL texture. It seems to me that the code below should work, but I get just black. What am I doing wrong? Do I need to set any properties on the CALayer / AVPlayerLayer first?

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int width = (int)[layer bounds].size.width;
    int height = (int)[layer bounds].size.height;

    CGContextRef context = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace, kCGImageAlphaPremultipliedLast);
    CGColorSpaceRelease(colorSpace);

    if(context == NULL) {
        ofLog(OF_LOG_ERROR, "getTextureFromLayer: failed to create context 1");
        return;
    }

    [[layer presentationLayer] renderInContext:context];

    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    int bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8;
    if(bytesPerPixel == 3) bytesPerPixel = 4;

    GLubyte *pixels = (GLubyte *)malloc(width * height * bytesPerPixel);

    CGContextRelease(context);
    context = CGBitmapContextCreate(pixels, width, height, CGImageGetBitsPerComponent(cgImage), width * bytesPerPixel, CGImageGetColorSpace(cgImage), kCGImageAlphaPremultipliedLast);

    if(context == NULL) {
        ofLog(OF_LOG_ERROR, "getTextureFromLayer: failed to create context 2");
        free(pixels);
        return;
    }

    CGContextDrawImage(context, CGRectMake(0.0, 0.0, width, height), cgImage);

    int glMode;
    switch(bytesPerPixel) {
        case 1:
            glMode = GL_LUMINANCE;
            break;
        case 3:
            glMode = GL_RGB;
            break;
        case 4:
        default:
            glMode = GL_RGBA;
            break;
    }

    if(texture.bAllocated() == false || texture.getWidth() != width || texture.getHeight() != height) {
        NSLog(@"getTextureFromLayer: allocating texture %i, %i\n", width, height);
        texture.allocate(width, height, glMode, true);
    }

    // test texture
    // for(int i = 0; i < width * height * 4; i++) pixels[i] = ofRandomuf() * 255;

    texture.loadData(pixels, width, height, glMode);

    CGContextRelease(context);
    CFRelease(cgImage);
    free(pixels);
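As an aside on the `bytesPerPixel == 3` promotion above: a 24-bit RGB image is widened to 32-bit RGBA before upload (CGContextDrawImage performs that conversion into the second bitmap context). A minimal plain-C sketch of the same expansion, with an illustrative helper name that is not from the code above:

```c
#include <assert.h>
#include <stdint.h>

/* Expand tightly packed RGB to RGBA with opaque alpha, mirroring the
   bytesPerPixel 3 -> 4 promotion in the snippet above (sketch only). */
static void rgb_to_rgba(const uint8_t *rgb, uint8_t *rgba, int pixelCount) {
    for (int i = 0; i < pixelCount; i++) {
        rgba[4 * i + 0] = rgb[3 * i + 0]; /* R */
        rgba[4 * i + 1] = rgb[3 * i + 1]; /* G */
        rgba[4 * i + 2] = rgb[3 * i + 2]; /* B */
        rgba[4 * i + 3] = 255;            /* opaque alpha */
    }
}
```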

PS The variable "texture" is a C++ OpenGL(-ES compatible) texture object which I know works. If I uncomment the "test texture" for-loop, filling the texture with random noise, I see the noise, so the problem is definitely earlier in the pipeline.

UPDATE

In response to Nick Weaver's answer, I tried a different approach, and now I always get NULL back from copyNextSampleBuffer with status == 3 (AVAssetReaderStatusFailed). Did I miss something?

Variables

    AVPlayer *videoPlayer;
    AVPlayerLayer *videoLayer;
    AVAssetReader *videoReader;
    AVAssetReaderTrackOutput *videoOutput;

Init

    videoPlayer = [[AVPlayer alloc] initWithURL:[NSURL fileURLWithPath:[NSString stringWithUTF8String:videoPath.c_str()]]];

    if(videoPlayer == nil) {
        NSLog(@"videoPlayer == nil ERROR LOADING %s\n", videoPath.c_str());
    } else {
        NSLog(@"videoPlayer: %@", videoPlayer);
        videoLayer = [[AVPlayerLayer playerLayerWithPlayer:videoPlayer] retain];
        videoLayer.frame = [ThreeDView instance].bounds;
        // [[ThreeDView instance].layer addSublayer:videoLayer]; // test to see if it's loading and running

        AVAsset *asset = videoPlayer.currentItem.asset;
        NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], (NSString *)kCVPixelBufferPixelFormatTypeKey,
                                  nil];

        videoReader = [[AVAssetReader alloc] initWithAsset:asset error:nil];
        videoOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:[tracks objectAtIndex:0] outputSettings:settings];
        [videoReader addOutput:videoOutput];
        [videoReader startReading];
    }

draw loop

    if(videoPlayer == 0) {
        ofLog(OF_LOG_WARNING, "Shot::drawVideo: videoPlayer == 0");
        return;
    }
    if(videoOutput == 0) {
        ofLog(OF_LOG_WARNING, "Shot::drawVideo: videoOutput == 0");
        return;
    }

    CMSampleBufferRef sampleBuffer = [videoOutput copyNextSampleBuffer];
    if(sampleBuffer == 0) {
        ofLog(OF_LOG_ERROR, "Shot::drawVideo: sampleBuffer == 0, status: %i", videoReader.status);
        return;
    }

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    unsigned char *pixels = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);

    int width = (int)CVPixelBufferGetWidth(imageBuffer);
    int height = (int)CVPixelBufferGetHeight(imageBuffer);

    if(videoTexture.bAllocated() == false || videoTexture.getWidth() != width || videoTexture.getHeight() != height) {
        NSLog(@"Shot::drawVideo() allocating texture %i, %i\n", width, height);
        videoTexture.allocate(width, height, GL_RGBA, true);
    }

    videoTexture.loadData(pixels, width, height, GL_BGRA);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CFRelease(sampleBuffer); // release only after use: imageBuffer is owned by sampleBuffer
1 answer

I think the question "iOS4: how to use a video file as an OpenGL texture?" will be helpful for your problem.


Source: https://habr.com/ru/post/886583/
