I'm using ffmpeg to decode video into YUV frames and rendering them with the iOS 5.0 function CVOpenGLESTextureCacheCreateTextureFromImage, taking Apple's GLCameraRipple sample as a reference.
Here is my result on the iPhone screen: [screenshot]
I'd like to know what I'm doing wrong. Below are the relevant parts of my code.
My ffmpeg frame configuration:
    ctx->p_sws_ctx = sws_getContext(ctx->p_video_ctx->width,
                                    ctx->p_video_ctx->height,
                                    ctx->p_video_ctx->pix_fmt,
                                    ctx->p_video_ctx->width,
                                    ctx->p_video_ctx->height,
                                    PIX_FMT_YUV420P,
                                    SWS_FAST_BILINEAR, NULL, NULL, NULL);
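For context, the decoded frames then go through that context roughly like this (simplified; `picture` and `p_frame` stand in for my actual variables):

    // Allocate a destination picture in the planar YUV420P layout
    // requested from sws_getContext above.
    AVPicture picture;
    avpicture_alloc(&picture, PIX_FMT_YUV420P,
                    ctx->p_video_ctx->width, ctx->p_video_ctx->height);

    // Convert/scale the decoded frame into that picture.
    sws_scale(ctx->p_sws_ctx,
              (const uint8_t * const *)p_frame->data, p_frame->linesize,
              0, ctx->p_video_ctx->height,
              picture.data, picture.linesize);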
My visualization method:
    if (NULL == videoTextureCache) {
        NSLog(@"displayPixelBuffer error");
        return;
    }

    CVPixelBufferRef pixelBuffer;
    CVPixelBufferCreateWithBytes(kCFAllocatorDefault, mTexW, mTexH,
                                 kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                 buffer, mFrameW * 3, NULL, 0, NULL, &pixelBuffer);

    CVReturn err;
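The luma and chroma textures are then created from the pixel buffer the same way GLCameraRipple does it. A simplified sketch, where `lumaTexture` and `chromaTexture` stand in for my instance variables:

    CVOpenGLESTextureRef lumaTexture = NULL;
    CVOpenGLESTextureRef chromaTexture = NULL;

    // Y plane -> texture unit 0
    glActiveTexture(GL_TEXTURE0);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                       videoTextureCache,
                                                       pixelBuffer,
                                                       NULL,
                                                       GL_TEXTURE_2D,
                                                       GL_LUMINANCE,
                                                       mTexW, mTexH,
                                                       GL_LUMINANCE,
                                                       GL_UNSIGNED_BYTE,
                                                       0,   // plane 0 = Y
                                                       &lumaTexture);

    // Interleaved CbCr plane (half resolution) -> texture unit 1
    glActiveTexture(GL_TEXTURE1);
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                       videoTextureCache,
                                                       pixelBuffer,
                                                       NULL,
                                                       GL_TEXTURE_2D,
                                                       GL_LUMINANCE_ALPHA,
                                                       mTexW / 2, mTexH / 2,
                                                       GL_LUMINANCE_ALPHA,
                                                       GL_UNSIGNED_BYTE,
                                                       1,   // plane 1 = CbCr
                                                       &chromaTexture);

    // Bind and clamp each texture (shown for luma; same for chroma).
    glBindTexture(CVOpenGLESTextureGetTarget(lumaTexture),
                  CVOpenGLESTextureGetName(lumaTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);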
My renderWithSquareVertices method:
    - (void)renderWithSquareVertices:(const GLfloat *)squareVertices
                     textureVertices:(const GLfloat *)textureVertices
    {
    }
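The body (omitted above) is essentially GLCameraRipple's attribute setup and draw call, roughly as follows (ATTRIB_VERTEX and ATTRIB_TEXCOORD are the attribute indices bound at program link time):

    // Point the vertex attributes at the full-screen quad and draw it,
    // as GLCameraRipple does.
    glVertexAttribPointer(ATTRIB_VERTEX, 2, GL_FLOAT, 0, 0, squareVertices);
    glEnableVertexAttribArray(ATTRIB_VERTEX);
    glVertexAttribPointer(ATTRIB_TEXCOORD, 2, GL_FLOAT, 0, 0, textureVertices);
    glEnableVertexAttribArray(ATTRIB_TEXCOORD);

    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);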
My fragment shader:
    uniform sampler2D SamplerY;
    uniform sampler2D SamplerUV;

    varying highp vec2 _texcoord;

    void main()
    {
        mediump vec3 yuv;
        lowp vec3 rgb;

        yuv.x = texture2D(SamplerY, _texcoord).r;
        yuv.yz = texture2D(SamplerUV, _texcoord).rg - vec2(0.5, 0.5);

        // BT.709 YUV -> RGB conversion, as in GLCameraRipple's shader
        rgb = mat3(      1,       1,       1,
                         0, -.18732,  1.8556,
                   1.57481, -.46813,       0) * yuv;

        gl_FragColor = vec4(rgb, 1);
    }
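The two samplers are pointed at texture units 0 and 1 once after linking, roughly like this (`program` stands in for my GL program handle):

    glUseProgram(program);
    // SamplerY reads from texture unit 0 (luma), SamplerUV from unit 1 (chroma)
    glUniform1i(glGetUniformLocation(program, "SamplerY"), 0);
    glUniform1i(glGetUniformLocation(program, "SamplerUV"), 1);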
Thank you very much,