Draw a texture in OpenGL ignoring its alpha channel

I have a texture loaded into memory in RGBA format with varying alpha values.

The image is loaded like this:

 GLuint texture = 0;
 glGenTextures(1, &texture);
 glBindTexture(GL_TEXTURE_2D, texture);
 self.texNum = texture;

 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, self.imageWidth, self.imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, [self.imageData bytes]);

I want to know how I can draw this texture so that the alpha channel is treated as all 1s and the texture is drawn as a plain RGB image.

Consider the base image:

http://www.ldeo.columbia.edu/~jcoplan/alpha/base.png

This image is a progression from 0 to 255 alpha and has an RGB value of 255,0,0 throughout.

However, if I draw it with blending disabled, I get an image that looks like this: www.ldeo.columbia.edu/~jcoplan/alpha/no_alpha.png

When what I really want is an image that looks like this: www.ldeo.columbia.edu/~jcoplan/alpha/correct.png

I'd really appreciate some pointers on how to get the alpha channel ignored entirely, so that just the RGB values are drawn.

Update: I tried to solve this with GL_COMBINE, like so:

glColor4f(1, 1, 1, 1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);

/* RGB comes straight from the texture */
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);

/* alpha comes from the current color, forced to 1.0 by glColor4f above */
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_PRIMARY_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA);
[self drawTexture];

But that didn't give me the result I wanted either.

4 Answers

Answer 1 (score +3):

Well, if the texture is RGBA and you don't want alpha to affect the output, draw with blending disabled:

glDisable(GL_BLEND);

But, as you found, that gives you an image that looks like this: www.ldeo.columbia.edu/~jcoplan/alpha/no_alpha.png

That means the RGB values were premultiplied by alpha when the image was loaded. You need to either fix the loading code or un-premultiply the data before passing it to OpenGL.

As for glTexEnv(GL_COMBINE ...) (i.e. the fixed-function texture combiners), I don't see a combiner setup that can undo a premultiply. Direct3D 9 has an operation in roughly this family (D3DTOP_MODULATEALPHA_ADDCOLOR), but I can't think of an OpenGL equivalent.
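
For reference, here is a minimal sketch of the draw-time state this implies, assuming the pixel data is not premultiplied (the GL_REPLACE environment and the texture handle are taken from the question; the rest is an assumption):

 glDisable(GL_BLEND);        /* fragment colors are written as-is */
 glDisable(GL_ALPHA_TEST);   /* alpha can no longer discard fragments either */
 glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);  /* output = texture color */
 glBindTexture(GL_TEXTURE_2D, self.texNum);
 [self drawTexture];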

Answer 2 (score +1):

You can also leave blending enabled and set glBlendFunc so that the destination is ignored entirely:

glBlendFunc(GL_ONE, GL_ZERO);
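
If blending has to stay enabled for other draws in the frame, a minimal sketch would be to make it a no-op just for this texture and then restore it (the restored GL_SRC_ALPHA func is an assumption about the rest of the renderer):

 glEnable(GL_BLEND);
 glBlendFunc(GL_ONE, GL_ZERO);   /* result = 1*src + 0*dst, so alpha never matters */
 [self drawTexture];
 glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);   /* assumed func used elsewhere */
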
Answer 3 (score 0):

You can also tell OpenGL to drop the alpha channel when the texture is uploaded. Set

 glPixelStorei(GL_UNPACK_ALIGNMENT, 4);

and call glTexImage2D with GL_RGB as the internal format; the alpha bytes are then discarded at upload time, so blending never sees them.
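
A sketch of that upload, assuming desktop OpenGL; note that OpenGL ES 1.x requires the internal format to match the pixel format, so on iOS the pixels would have to be repacked to 3 bytes each instead:

 glPixelStorei(GL_UNPACK_ALIGNMENT, 4);   /* 4-byte RGBA pixels keep rows 4-byte aligned */
 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,   /* store RGB only: the alpha bytes are dropped */
              self.imageWidth, self.imageHeight, 0,
              GL_RGBA, GL_UNSIGNED_BYTE, [self.imageData bytes]);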

Answer 4 (score 0):

I had a similar problem, and it turned out to be because iOS premultiplies the RGB values by alpha when it loads an image (as discussed in some other answers and comments here). I would like to know if there is a way to disable the premultiplication, but in the meantime I am "un-premultiplying" the data using code derived from this thread and this thread.

    // un-premultiply: scale each RGB byte back up by 255/alpha
    uint8_t *imageBytes = (uint8_t *)imageData;
    int byteCount = width * height * 4;
    for (int i = 0; i < byteCount; i += 4) {
        uint8_t a = imageBytes[i + 3];     // alpha of this RGBA pixel
        if (a != 255 && a != 0) {          // 255: nothing to undo; 0: avoid dividing by zero
            float alphaFactor = 255.0f / a;
            imageBytes[i]     *= alphaFactor;  // R (truncated back to uint8_t)
            imageBytes[i + 1] *= alphaFactor;  // G
            imageBytes[i + 2] *= alphaFactor;  // B
        }
    }
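
A sketch of how this fixup could be wired into the upload from the question, assuming the loop above is wrapped in a hypothetical unpremultiply(bytes, width, height) helper and the bytes live in an NSData as there:

    // -[NSData bytes] is const, so run the fixup on a mutable copy
    NSMutableData *fixed = [self.imageData mutableCopy];
    unpremultiply([fixed mutableBytes], self.imageWidth, self.imageHeight);  // hypothetical helper
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, self.imageWidth, self.imageHeight,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, [fixed bytes]);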

Source: https://habr.com/ru/post/1753741/

