Chromatic Aberration Reflection / Refraction - Eye Correction

I am writing a GLSL shader that simulates chromatic aberration for simple objects. I am staying compatible with OpenGL 2.0, so I use the built-in OpenGL matrix stack. Here is the simple vertex shader:

    uniform vec3 cameraPos;

    varying vec3 incident;
    varying vec3 normal;

    void main(void)
    {
        vec4 position = gl_ModelViewMatrix * gl_Vertex;
        incident = position.xyz / position.w - cameraPos;
        normal = gl_NormalMatrix * gl_Normal;
        gl_Position = ftransform();
    }

The cameraPos uniform is the camera's position in model space, as you might expect. Here is the fragment shader:

    const float etaR = 1.14;
    const float etaG = 1.12;
    const float etaB = 1.10;
    const float fresnelPower = 2.0;
    const float F = ((1.0 - etaG) * (1.0 - etaG)) / ((1.0 + etaG) * (1.0 + etaG));

    uniform samplerCube environment;

    varying vec3 incident;
    varying vec3 normal;

    void main(void)
    {
        vec3 i = normalize(incident);
        vec3 n = normalize(normal);

        // Schlick approximation of the Fresnel reflectance
        float ratio = F + (1.0 - F) * pow(1.0 - dot(-i, n), fresnelPower);

        // One refraction vector per channel for the chromatic aberration
        vec3 refractR = vec3(gl_TextureMatrix[0] * vec4(refract(i, n, etaR), 1.0));
        vec3 refractG = vec3(gl_TextureMatrix[0] * vec4(refract(i, n, etaG), 1.0));
        vec3 refractB = vec3(gl_TextureMatrix[0] * vec4(refract(i, n, etaB), 1.0));
        vec3 reflectDir = vec3(gl_TextureMatrix[0] * vec4(reflect(i, n), 1.0));

        vec4 refractColor;
        refractColor.ra = textureCube(environment, refractR).ra;
        refractColor.g = textureCube(environment, refractG).g;
        refractColor.b = textureCube(environment, refractB).b;

        vec4 reflectColor = textureCube(environment, reflectDir);

        // mix() of two vec4s yields a vec4 (assigning it to a vec3, as the
        // original listing did, does not compile)
        vec4 combinedColor = mix(refractColor, reflectColor, ratio);
        gl_FragColor = vec4(combinedColor.rgb, 1.0);
    }
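For reference, F here is Schlick's approximation of the Fresnel reflectance at normal incidence, F = ((1 - etaG) / (1 + etaG))^2, evaluated with the green channel's index of refraction, and ratio extrapolates it towards grazing angles as F + (1 - F) * (1 - cos(theta))^fresnelPower, where theta is the angle between the view direction and the surface normal.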

environment is a cube map that is rendered in real time from the surroundings of the drawn object.

Under normal circumstances, the shader behaves (I believe) as expected, giving this result:

[image: correct shader behavior]

However, when the camera rotates 180 degrees around its target, so that it now looks at the object from the opposite side, the refracted/reflected image becomes distorted like this (it happens gradually for angles between 0 and 180 degrees, of course):

[image: incorrect shader behavior]

Similar artifacts appear when the camera moves down/up; it only works 100% correctly when the camera is directly above the target object (looking down the negative Z axis, in this case).

It is hard for me to work out which transformation in the shader is responsible for this distorted image, but it must be something obvious related to how cameraPos is handled. What is causing the image to deform like this?

1 answer

This looks suspicious to me:

    vec4 position = gl_ModelViewMatrix * gl_Vertex;
    incident = position.xyz / position.w - cameraPos;

Is your cameraPos in world space? You are subtracting a presumably world-space cameraPos from a view-space position vector. You need to do the calculation either entirely in world space or entirely in view space; you cannot mix the two.
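The view-space route needs no cameraPos uniform at all, since the eye sits at the origin of view space. A minimal sketch of that variant; it assumes gl_TextureMatrix[0] (which the fragment shader already applies to the lookup directions) is loaded with the inverse of the camera's rotation, so that the view-space reflect/refract directions are carried back into the cube map's world-aligned frame:

    varying vec3 incident;
    varying vec3 normal;

    void main(void)
    {
        vec4 position = gl_ModelViewMatrix * gl_Vertex;
        // In view space the eye is at the origin, so the incident vector
        // is simply the vertex position itself.
        incident = position.xyz / position.w;
        // gl_NormalMatrix also produces a view-space normal, so both
        // varyings are now in the same space.
        normal = gl_NormalMatrix * gl_Normal;
        gl_Position = ftransform();
    }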

To do this correctly in world space, you will have to upload the model matrix separately, so that you can compute a world-space incident vector.
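A minimal sketch of the world-space variant; modelMatrix is a hypothetical uniform the application has to supply itself, since OpenGL 2.0 only tracks the combined modelview matrix, and the normal transform below assumes modelMatrix contains no non-uniform scaling (otherwise use the inverse transpose):

    uniform mat4 modelMatrix;  // object -> world, uploaded by the application
    uniform vec3 cameraPos;    // camera position in world space

    varying vec3 incident;
    varying vec3 normal;

    void main(void)
    {
        vec4 worldPos = modelMatrix * gl_Vertex;
        // Both operands are now world-space, so the subtraction is valid.
        incident = worldPos.xyz / worldPos.w - cameraPos;
        // w = 0.0 transforms the normal as a direction (ignores translation).
        normal = vec3(modelMatrix * vec4(gl_Normal, 0.0));
        gl_Position = ftransform();
    }

With world-space directions, gl_TextureMatrix[0] can simply stay identity for a world-aligned cube map.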


Source: https://habr.com/ru/post/911511/

