glReadPixels and alpha channel returning 1.0

I read pixel data back from the framebuffer and everything works, except for the alpha value, which is always 1.0:

  GLfloat lebuf[areasize * 4];
  glReadPixels(xstart, ystart, partw, parth, GL_RGBA, GL_FLOAT, lebuf);

In the window creation code I request an alpha channel:

  SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8);
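
For reference, here is a minimal sketch of that setup, assuming SDL2 (the window title and sizes are illustrative). SDL_GL_SetAttribute only expresses a request, so it is worth asking SDL what the created context actually provides:

  #include <SDL.h>
  #include <stdio.h>

  int main(int argc, char *argv[]) {
      SDL_Init(SDL_INIT_VIDEO);

      /* Request an 8-bit alpha channel *before* creating the window. */
      SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);

      SDL_Window *win = SDL_CreateWindow("alpha test",
          SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
          640, 480, SDL_WINDOW_OPENGL);
      SDL_GLContext ctx = SDL_GL_CreateContext(win);

      /* The attribute is only a request; check what we actually got. */
      int alphaBits = 0;
      SDL_GL_GetAttribute(SDL_GL_ALPHA_SIZE, &alphaBits);
      printf("alpha bits in framebuffer: %d\n", alphaBits);

      SDL_GL_DeleteContext(ctx);
      SDL_DestroyWindow(win);
      SDL_Quit();
      return 0;
  }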

Is there somewhere else I should look to find out why the alpha channel always seems to be 1.0? Better yet, is there any way other than glReadPixels to get the texture into client memory from the framebuffer?

Edit: this is how I clear the buffer:

 glClearColor(0,0,0,0);
 glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
3 answers

Could you check:

  • What format did SDL actually give you (glGetIntegerv(GL_ALPHA_BITS, &bits))?
  • Are you sure you are not clearing alpha to 1 (glClearColor)? Try clearing it to 0.5 instead. Do you read back 0.5?
  • Are you sure alpha writes are not masked off somewhere (glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE))?
  • Is something drawn over the region you read back, overwriting alpha?
  • Is blending enabled, and with what blend function? (A combined diagnostic sketch follows this list.)
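
A minimal sketch tying these checks together (the helper name is hypothetical; call it with a current GL context bound):

  #include <GL/gl.h>
  #include <stdio.h>

  /* Hypothetical helper: run the checks from the list above. */
  static void debug_alpha_readback(void)
  {
      GLint alphaBits = 0;
      glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
      printf("GL_ALPHA_BITS: %d\n", alphaBits);  /* 0 means no alpha planes */

      GLboolean mask[4];
      glGetBooleanv(GL_COLOR_WRITEMASK, mask);
      printf("color mask RGBA: %d %d %d %d\n",
             mask[0], mask[1], mask[2], mask[3]);

      printf("GL_BLEND enabled: %d\n", glIsEnabled(GL_BLEND));

      /* Clear alpha to 0.5 and read one pixel back: anything other
         than ~0.5 means alpha is lost somewhere along the way. */
      glClearColor(0.0f, 0.0f, 0.0f, 0.5f);
      glClear(GL_COLOR_BUFFER_BIT);
      GLfloat px[4];
      glReadPixels(0, 0, 1, 1, GL_RGBA, GL_FLOAT, px);
      printf("read back alpha: %f\n", px[3]);
  }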

With GLUT, you have to request an alpha channel in the display mode, e.g.:

glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA | GLUT_DEPTH | GLUT_STENCIL);

Otherwise the framebuffer has no alpha bitplanes and glReadPixels always returns alpha = 1.


Please use the following line; the problem will be solved:

glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_ALPHA | GLUT_DEPTH | GLUT_STENCIL);
