OpenGL ES 2.0 and the default FrameBuffer in iOS

I'm a little confused about framebuffers. Currently, to draw on the screen, I generate a framebuffer with a renderbuffer attached at GL_COLOR_ATTACHMENT0, using this code:

    -(void)initializeBuffers{
        //Build the main framebuffer
        glGenFramebuffers(1, &frameBuffer);
        glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);

        //Build the color buffer
        glGenRenderbuffers(1, &colorBuffer);
        glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer);

        //Set up the color buffer with the EAGLLayer (it automatically defines width and height of the buffer)
        [context renderbufferStorage:GL_RENDERBUFFER fromDrawable:EAGLLayer];
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &bufferWidth);
        glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &bufferHeight);

        //Attach the color buffer to the framebuffer
        glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, colorBuffer);

        //Check the framebuffer status
        GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
        NSAssert(status == GL_FRAMEBUFFER_COMPLETE, ERROR_FRAMEBUFFER_FAIL);
    }

And I show the contents of the buffer with

 [context presentRenderbuffer:GL_RENDERBUFFER]; 
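For reference, presentRenderbuffer: shows whatever renderbuffer is currently bound, so a typical per-frame sequence using the variable names from the code above might look like the following sketch (the actual draw calls are placeholders):

    - (void)render {
        // Direct all drawing into our framebuffer (and its color renderbuffer)
        glBindFramebuffer(GL_FRAMEBUFFER, frameBuffer);
        glViewport(0, 0, bufferWidth, bufferHeight);

        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        // ... glUseProgram / glDrawArrays / glDrawElements calls go here ...

        // presentRenderbuffer: presents the renderbuffer that is currently bound,
        // so bind the CAEAGLLayer-backed color renderbuffer before presenting.
        glBindRenderbuffer(GL_RENDERBUFFER, colorBuffer);
        [context presentRenderbuffer:GL_RENDERBUFFER];
    }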

Looking at this question, I saw a comment by Artu Peltonen, who says:

The default framebuffer is where you render by default; you don't have to do anything to get it. Framebuffer objects are something you can render into instead, and that is off-screen rendering. If you do that, your image ends up in a texture instead of the default framebuffer (which is what gets displayed on the screen). You can copy the image from that texture to the default framebuffer (the screen); this is usually done with blitting (but that is only available starting with OpenGL ES 3.0). But if all you wanted was to show the image on the screen, you probably would not use an FBO in the first place.

So, if I understand that correctly, my method is only doing off-screen rendering. In that case, what should I do to render into the default framebuffer?! (Note: I do not want to use GLKView ...)

1 answer

The OpenGL ES specification provides for two kinds of framebuffers: window-system-provided framebuffers and framebuffer objects. The default framebuffer would be a window-system-provided framebuffer. But the specification does not require a window-system-provided framebuffer, or a default framebuffer, to exist.

In iOS there are no window-system-provided framebuffers, and there is no default framebuffer: all drawing is done with framebuffer objects. To draw to the screen, you create a renderbuffer whose storage comes from a CAEAGLLayer object (or you use one that was created on your behalf, as when you use the GLKView class). That is exactly what your code does.
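For completeness, here is a minimal sketch of where that CAEAGLLayer usually comes from; the GLView class name and the context ivar are made up for the example and are not part of the question's code:

    #import <UIKit/UIKit.h>
    #import <QuartzCore/CAEAGLLayer.h>
    #import <OpenGLES/EAGL.h>

    @interface GLView : UIView {
        EAGLContext *context;
    }
    @end

    @implementation GLView

    // Back this view with a CAEAGLLayer so it can be passed to
    // renderbufferStorage:fromDrawable: as the drawable.
    + (Class)layerClass {
        return [CAEAGLLayer class];
    }

    - (instancetype)initWithFrame:(CGRect)frame {
        if ((self = [super initWithFrame:frame])) {
            CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
            eaglLayer.opaque = YES; // opaque layers avoid blending with the views behind them

            context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
            [EAGLContext setCurrentContext:context];

            // ... call the -initializeBuffers method from the question here,
            // using eaglLayer as the drawable ...
        }
        return self;
    }

    @end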

To render off-screen, you create a renderbuffer and call glRenderbufferStorage to allocate storage for it. That storage is not associated with a CAEAGLLayer, so the renderbuffer cannot be (directly) displayed on the screen. (It is not a texture either; setting up a texture as a render target works differently. It is just an off-screen buffer.)
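To illustrate the difference, a minimal off-screen setup might look like this; the offscreenFBO and offscreenColor names and the 256x256 size are made up for the example:

    GLuint offscreenFBO, offscreenColor;

    glGenFramebuffers(1, &offscreenFBO);
    glBindFramebuffer(GL_FRAMEBUFFER, offscreenFBO);

    // The storage is allocated by OpenGL ES itself via glRenderbufferStorage,
    // not by a CAEAGLLayer, so this renderbuffer can never be presented directly.
    glGenRenderbuffers(1, &offscreenColor);
    glBindRenderbuffer(GL_RENDERBUFFER, offscreenColor);
    glRenderbufferStorage(GL_RENDERBUFFER, GL_RGBA4, 256, 256);

    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                              GL_RENDERBUFFER, offscreenColor);

    // Anything drawn while this framebuffer is bound stays off-screen; read it
    // back with glReadPixels, or attach a texture instead if you want to sample it.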

There is more information about all of this, and sample code for each approach, in Apple's OpenGL ES Programming Guide for iOS.


Source: https://habr.com/ru/post/1497851/

