How can I get Alpha blending transparency in OpenGL ES 2.0?

I'm in the middle of porting some code from OpenGL ES 1.x to OpenGL ES 2.0, and I'm trying to get transparency working as it did before; currently all my triangles are rendered completely opaque.

My OpenGL setup has the following lines:

    // Draw objects back to front
    glDisable(GL_DEPTH_TEST);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glDepthMask(GL_FALSE);

And my shaders look like this:

    attribute vec4 Position;
    attribute vec4 SourceColor;
    uniform highp mat4 mat;
    varying vec4 DestinationColor;

    void main(void)
    {
        DestinationColor = SourceColor;
        gl_Position = Position * mat;
    }

and this:

    varying lowp vec4 DestinationColor;

    void main(void)
    {
        gl_FragColor = DestinationColor;
    }

What could be wrong?

EDIT: If I manually set the alpha to 0.5 in the fragment shader (or even in the vertex shader), as suggested by keaukraine below, then everything is rendered transparent. Also, if I change the color values I pass to OpenGL to floats instead of unsigned bytes, the code works correctly.

So there seems to be something wrong with the code that passes the color information to OpenGL, and I would still like to know what the problem is.

My vertices were defined like this (no change from OpenGL ES 1.x code):

    typedef struct {
        GLfloat x, y, z, rhw;
        GLubyte r, g, b, a;
    } Vertex;

And I used the following code to pass them to OpenGL (similar to OpenGL ES 1.x):

    glBindBuffer(GL_ARRAY_BUFFER, glTriangleVertexBuffer);
    glBufferData(GL_ARRAY_BUFFER, sizeof(Vertex) * nTriangleVertices, triangleVertices, GL_STATIC_DRAW);
    glUniformMatrix4fv(matLocation, 1, GL_FALSE, m);
    glVertexAttribPointer(positionSlot, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, x));
    glVertexAttribPointer(colorSlot, 4, GL_UNSIGNED_BYTE, GL_FALSE, sizeof(Vertex), (void*)offsetof(Vertex, r));
    glDrawArrays(GL_TRIANGLES, 0, nTriangleVertices);
    glBindBuffer(GL_ARRAY_BUFFER, 0);

What is wrong with that?

3 answers

Your color vertex attribute values are not normalized. This means that the vertex shader sees the values for this attribute in the range 0-255.

Change the fourth argument of glVertexAttribPointer to GL_TRUE and the values will be normalized (scaled to the range 0.0-1.0), as you would expect.
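
Applied to the code from the question, only the fourth argument of the color attribute call changes:

    /* GL_TRUE: normalize the GLubyte components from 0-255 to 0.0-1.0 */
    glVertexAttribPointer(colorSlot, 4, GL_UNSIGNED_BYTE, GL_TRUE,
                          sizeof(Vertex), (void*)offsetof(Vertex, r));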

See http://www.khronos.org/opengles/sdk/docs/man/xhtml/glVertexAttribPointer.xml


I suspect that the DestinationColor varying in your fragment shader always contains 0xFF in the alpha channel? If so, that is your problem. Try changing it so that the alpha actually varies.

Update: we found 2 good solutions:

  • Use floats instead of unsigned bytes for the color values that are passed to the DestinationColor varying in the fragment shader (see the sketch after this list).

  • Or, as GuyRT pointed out, you can change the fourth argument of glVertexAttribPointer to GL_TRUE to tell OpenGL ES to normalize the values when they are converted from integers to floats.
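
A minimal sketch of the first option, reusing the colorSlot attribute location from the question (the struct name VertexF is illustrative, not from the original code):

    typedef struct {
        GLfloat x, y, z, rhw;
        GLfloat r, g, b, a;  /* color components as floats in 0.0-1.0 */
    } VertexF;

    /* No normalization needed: the data is already in the 0.0-1.0 range */
    glVertexAttribPointer(colorSlot, 4, GL_FLOAT, GL_FALSE,
                          sizeof(VertexF), (void*)offsetof(VertexF, r));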


To debug this situation, you can try setting a constant alpha and see if it matters:

    varying lowp vec4 DestinationColor;

    void main(void)
    {
        gl_FragColor = DestinationColor;
        gl_FragColor.a = 0.5; /* try other values from 0 to 1 to test blending */
    }

You must also make sure that you select an EGL configuration with an alpha channel.
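
For example, with EGL you can request a config that has a destination alpha channel (a sketch, assuming an already-initialized display):

    const EGLint attribs[] = {
        EGL_RED_SIZE,        8,
        EGL_GREEN_SIZE,      8,
        EGL_BLUE_SIZE,       8,
        EGL_ALPHA_SIZE,      8,  /* request a destination alpha channel */
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_NONE
    };
    EGLConfig config;
    EGLint numConfigs;
    eglChooseConfig(display, attribs, &config, 1, &numConfigs);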

And do not forget to specify the precision for floats in fragment shaders! See the OpenGL ES Shading Language 1.0 specification, section 4.5.3: http://www.khronos.org/registry/gles/specs/2.0/GLSL_ES_Specification_1.0.17.pdf
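
For example, a single default precision statement at the top of the fragment shader covers every float declaration that lacks an explicit qualifier:

    precision mediump float;

    varying vec4 DestinationColor;

    void main(void)
    {
        gl_FragColor = DestinationColor;
    }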

