WebGL: Do zero-alpha pixels in a texture necessarily update the depth buffer?

I am trying to display a texture containing fully transparent pixels and fully opaque pixels. It seems my only option is to render these textured polygons back to front (after all completely opaque polygons), because even completely transparent pixels update the depth buffer. But I'm not sure that is accurate. Is it?

I thought it might be possible for the fragment shader to leave the depth buffer alone for transparent pixels only, but apparently it is not. Did I miss something?

Is there any other reasonable way to render images with a very irregular outline, other than building them out of many polygons, that I'm not thinking of?
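For context, here is roughly the draw state I am using; a minimal sketch assuming a standard alpha-blending setup (drawSprite stands in for my actual draw call):

    gl.enable(gl.DEPTH_TEST);
    gl.depthMask(true);   // depth writes are on
    gl.enable(gl.BLEND);
    gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
    // Blending only affects the color buffer: every fragment of the quad
    // still writes depth, even where the sampled alpha is 0.
    drawSprite(texture);  // hypothetical helper for my textured quad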

2 answers

Now that I have asked the question, I seem to have found at least one answer myself. The discard statement in the fragment shader does what I need:

    precision mediump float;

    // Assumed declarations for the shader's inputs (types inferred from usage).
    uniform sampler2D uSampler;
    varying vec2 vTextureCoord;
    varying vec3 vLighting;

    void main(void) {
        vec4 textureColor = texture2D(uSampler, vTextureCoord);
        // Fragments below the alpha cutoff are discarded: they update
        // neither the color buffer nor the depth buffer.
        if (textureColor.a < 0.5) {
            discard;
        } else {
            gl_FragColor = vec4(textureColor.rgb * vLighting, textureColor.a);
        }
    }
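Because a discarded fragment updates neither the color buffer nor the depth buffer, hard-edged cutouts rendered this way need no blending and no depth sorting. A minimal sketch of the host-side state, assuming a standard WebGL context (drawScene is a hypothetical helper):

    // Depth testing and depth writes can stay on: fragments that pass the
    // alpha cutoff behave like opaque geometry, the rest are discarded.
    gl.enable(gl.DEPTH_TEST);
    gl.depthMask(true);
    gl.disable(gl.BLEND);   // no blending needed for 0-or-1 alpha
    drawScene();            // hypothetical helper; draw order does not matter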

By the time the fragment shader runs, the decision that a fragment will be produced has already been made, regardless of what alpha value you output; blending affects only the color buffer. In desktop OpenGL the classic way around this is the fixed-function alpha test (glEnable(GL_ALPHA_TEST)); WebGL has no alpha test, so the discard statement shown above is the equivalent. Either way, this only works for hard 0-or-1 alpha, for example when drawing sprites with hard edges. If you need soft edges or genuinely translucent objects, you have to fall back to sorting and drawing back to front, and you may need to sort your geometry for that.
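A minimal sketch of that back-to-front fallback, assuming each translucent object exposes a cameraSpaceDepth value and a draw() method (both hypothetical names):

    // 1. Opaque pass: depth test and depth writes on, no blending.
    gl.enable(gl.DEPTH_TEST);
    gl.depthMask(true);
    gl.disable(gl.BLEND);
    opaqueObjects.forEach(obj => obj.draw());

    // 2. Translucent pass: keep the depth test so opaque geometry still
    //    occludes, but stop writing depth, and blend back to front.
    gl.depthMask(false);
    gl.enable(gl.BLEND);
    gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
    translucentObjects
        .sort((a, b) => b.cameraSpaceDepth - a.cameraSpaceDepth) // farthest first
        .forEach(obj => obj.draw());

    gl.depthMask(true); // restore depth writes for the next frame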


Source: https://habr.com/ru/post/1433177/

