How to implement grayscale rendering in OpenGL?

When rendering a scene with textured polygons, I would like to be able to switch between rendering in the original colors and in grayscale. I tried to achieve this with blending operations and with the color matrix; neither worked (with blending, I could not find a glBlendFunc() that achieved anything even remotely resembling what I wanted, and color matrix operations ... are discussed here).

The solution that comes to mind (but which also seems quite expensive) is to capture each frame to a texture, convert that texture to grayscale, and display it instead ... (Where I said grayscale, I actually meant something with low saturation, but I assume that for most of the possible solutions it will not differ much from grayscale.)

What other options do I have?

+5
3 answers

The OpenGL framebuffer uses the RGB color space by default, which does not store saturation explicitly. You need an approach that extracts the saturation, changes it, and writes it back.

My previous suggestion, which simply used the length of the RGB vector to represent the luminance, was incorrect, since it did not take scaling into account. My apologies.

Credit for the new short fragment goes to the regular "RTFM_FTW" from ##opengl and ##opengl3 on FreeNode/IRC. It lets you modify the saturation directly, without computing the costly RGB->HSV->RGB conversion, which is exactly what you want. Although the HSV code below does not strictly answer your question, I let it stay.

#extension GL_ARB_texture_rectangle : enable   /* needed for sampler2DRect before GLSL 1.30 */
uniform sampler2DRect S;   /* scene texture */
uniform float T;           /* saturation factor: 0.0 = grayscale, 1.0 = original colors */

void main( void )
{
    vec3 R0 = texture2DRect( S, gl_TexCoord[0].st ).rgb;
    /* mix between the Rec. 709 luma (grayscale) value and the original color */
    gl_FragColor = vec4( mix( vec3( dot( R0, vec3( 0.2125, 0.7154, 0.0721 ) ) ), R0, T ), gl_Color.a );
}
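On the application side, feeding that shader is just a couple of uniform updates. A minimal sketch in C, where desatProgram and set_desaturation are hypothetical names standing in for the linked program and a helper in your own code; "S" and "T" are the uniforms declared above:

#include <GL/glew.h>   /* or any loader that exposes the GL 2.0 uniform API */

/* Hypothetical helper: pick the desaturation amount each frame.
 * desatProgram is assumed to be the program linked from the shader above. */
static void set_desaturation(GLuint desatProgram, float t)
{
    glUseProgram(desatProgram);
    glUniform1i(glGetUniformLocation(desatProgram, "S"), 0);  /* scene texture on unit 0 */
    glUniform1f(glGetUniformLocation(desatProgram, "T"), t);  /* 0.0 = grayscale, 1.0 = full color */
}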

If you need more control than just the saturation, you need to convert to the HSL or HSV color space, as shown below using a GLSL fragment shader.

Read the OpenGL 3.0 and GLSL 1.30 specifications available at http://www.opengl.org/registry to learn how to use GLSL v1.30 functionality.

#version 130

#define RED 0
#define GREEN 1
#define BLUE 2

in vec4 vertexIn;
in vec4 colorIn;
in vec2 tcoordIn;
out vec4 pixel;

uniform sampler2D tex;   /* the scene texture; must be a uniform sampler */
vec4 texel;
const float epsilon = 1e-6;

/* GLSL has no built-in swap; small helper used by the sort below. */
void swap(inout int a, inout int b)
{
    int t = a; a = b; b = t;
}

vec3 RGBtoHSV(vec3 color)
{
    /* hue, saturation and value are all in the range [0,1> here, as opposed to their
       normal ranges of: hue: [0,360>, sat: [0,100] and value: [0,256> */
    int sortindex[3] = int[3](RED, GREEN, BLUE);
    float rgbArr[3] = float[3](color.r, color.g, color.b);
    float hue, saturation, value, diff;
    float minCol, maxCol;
    int minIndex, maxIndex;

    if(color.g < color.r)
        swap(sortindex[0], sortindex[1]);
    if(color.b < color.g)
        swap(sortindex[1], sortindex[2]);
    if(color.r < color.b)
        swap(sortindex[2], sortindex[0]);

    minIndex = sortindex[0];
    maxIndex = sortindex[2];
    minCol = rgbArr[minIndex];
    maxCol = rgbArr[maxIndex];
    diff = maxCol - minCol;

    /* Hue */
    if(diff < epsilon){
        hue = 0.0;
    }
    else if(maxIndex == RED){
        hue = ((1.0/6.0) * ((color.g - color.b) / diff)) + 1.0;
        hue = fract(hue);
    }
    else if(maxIndex == GREEN){
        hue = ((1.0/6.0) * ((color.b - color.r) / diff)) + (1.0/3.0);
    }
    else if(maxIndex == BLUE){
        hue = ((1.0/6.0) * ((color.r - color.g) / diff)) + (2.0/3.0);
    }

    /* Saturation */
    if(maxCol < epsilon)
        saturation = 0.0;
    else
        saturation = (maxCol - minCol) / maxCol;

    /* Value */
    value = maxCol;

    return vec3(hue, saturation, value);
}

vec3 HSVtoRGB(vec3 color)
{
    float f, p, q, t, hueRound;
    int hueIndex;
    float hue, saturation, value;
    vec3 result;

    /* just for clarity */
    hue = color.r;
    saturation = color.g;
    value = color.b;

    hueRound = floor(hue * 6.0);
    hueIndex = int(hueRound) % 6;

    f = (hue * 6.0) - hueRound;
    p = value * (1.0 - saturation);
    q = value * (1.0 - f*saturation);
    t = value * (1.0 - (1.0 - f)*saturation);

    switch(hueIndex)
    {
        case 0:
            result = vec3(value, t, p);
            break;
        case 1:
            result = vec3(q, value, p);
            break;
        case 2:
            result = vec3(p, value, t);
            break;
        case 3:
            result = vec3(p, q, value);
            break;
        case 4:
            result = vec3(t, p, value);
            break;
        default:
            result = vec3(value, p, q);
            break;
    }
    return result;
}

void main(void)
{
    vec4 srcColor;
    vec3 hsvColor;
    vec3 rgbColor;

    texel = texture(tex, tcoordIn);
    srcColor = texel * colorIn;
    hsvColor = RGBtoHSV(srcColor.rgb);
    /* You can do further changes here, if you want. */
    hsvColor.g = 0.0;   /* Set saturation to zero */
    rgbColor = HSVtoRGB(hsvColor);
    pixel = vec4(rgbColor.r, rgbColor.g, rgbColor.b, srcColor.a);
}
+9

If you are working against a modern-enough OpenGL, I would say pixel shaders are a very suitable solution here. Either hook into each polygon's shading as it is rendered, or do a single full-screen quad in a second pass that simply reads each pixel, converts it to grayscale, and writes it back. Unless your resolution, graphics hardware and target frame rate are somehow "extreme", that should be doable these days in most cases.

+2

For most desktops Render-To-Texture is not that expensive anymore; Compiz, Aero, etc., and effects such as bloom or depth of field seen in recent titles all depend on it.

In fact, you are not converting the screen texture as such to grayscale; you want to draw a screen-sized quad with that texture and a fragment shader that converts the values to grayscale.
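As a rough sketch of those two passes in C: the names fbo, sceneTex, grayProgram, width and height below are hypothetical handles for objects you would create elsewhere, and the quad is drawn with fixed-function calls purely for brevity.

#include <GL/glew.h>   /* or whichever loader provides the FBO entry points */

/* Sketch only: render the scene into a texture, then draw a screen-sized
 * quad through a grayscale fragment shader. */
void render_grayscale_frame(GLuint fbo, GLuint sceneTex, GLuint grayProgram,
                            int width, int height)
{
    /* Pass 1: render the scene as usual, but into the FBO's color texture. */
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glViewport(0, 0, width, height);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    /* ... normal scene drawing goes here ... */
    glBindFramebuffer(GL_FRAMEBUFFER, 0);

    /* Pass 2: draw a screen-sized quad sampling that texture with the grayscale shader. */
    glUseProgram(grayProgram);
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, sceneTex);
    glUniform1i(glGetUniformLocation(grayProgram, "tex"), 0);

    glMatrixMode(GL_PROJECTION); glLoadIdentity();
    glOrtho(0.0, 1.0, 0.0, 1.0, -1.0, 1.0);
    glMatrixMode(GL_MODELVIEW);  glLoadIdentity();

    glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(0.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f(1.0f, 0.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f(1.0f, 1.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(0.0f, 1.0f);
    glEnd();

    glUseProgram(0);
}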

Another option is to have two sets of fragment shaders for your triangles: one that simply copies the gl_FrontColor attribute as the fixed-function pipeline does, and another that writes grayscale values to the screen buffer.
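In C that switch is just a matter of binding one program or the other before drawing; a minimal sketch, where colorProgram and grayProgram are hypothetical program handles built elsewhere from those two fragment shaders:

#include <GL/glew.h>

/* Hypothetical handles: two programs sharing the same vertex shader but
 * linked against the pass-through and grayscale fragment shaders. */
void draw_scene(GLuint colorProgram, GLuint grayProgram, int grayscaleEnabled)
{
    glUseProgram(grayscaleEnabled ? grayProgram : colorProgram);
    /* ... issue the same draw calls in both cases ... */
    glUseProgram(0);
}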

A third option could be color-indexed mode, if you set up a grayscale palette, but that mode may be deprecated and poorly supported by now; plus you lose a lot of functionality, such as blending, if I remember correctly.

+2

Source: https://habr.com/ru/post/910488/

