Simple procedural skybox

In an attempt to create a very simple sky, I made a skybox (basically a cube going from (-1, -1, -1) to (1, 1, 1)), which is drawn after all my other geometry and forced to the far plane using the following simple vertex shader:

#version 330

layout(location = 0) in vec4 position;
layout(location = 1) in vec4 normal;

out Data
{
    vec4 eyespace_position;
    vec4 eyespace_normal;
    vec4 worldspace_position;
    vec4 raw_position;
} vtx_data;

uniform mat4 model;
uniform mat4 view;
uniform mat4 projection;

void main()
{
    // Zero out the translation column so the skybox stays centered on the camera.
    mat4 view_without_translation = view;
    view_without_translation[3][0] = 0.0f;
    view_without_translation[3][1] = 0.0f;
    view_without_translation[3][2] = 0.0f;

    vtx_data.raw_position = position;
    vtx_data.worldspace_position = model * position;
    vtx_data.eyespace_position = view_without_translation * vtx_data.worldspace_position;

    // The .xyww swizzle forces depth to 1.0 after the perspective divide.
    gl_Position = (projection * vtx_data.eyespace_position).xyww;
}
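(A note on the .xyww swizzle, since it is the part that pins the sky to the background: after the perspective divide, depth = gl_Position.z / gl_Position.w, so copying w into z yields a constant depth of 1.0. The swizzle is shorthand for something like the sketch below; it also assumes the depth test passes on equality, e.g. glDepthFunc(GL_LEQUAL).)

    vec4 clip = projection * vtx_data.eyespace_position;
    clip.z = clip.w;      // depth becomes w / w = 1.0 after the divide
    gl_Position = clip;   // same effect as (projection * ...).xyww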

From this, I am trying to render my sky as a very simple gradient from deep blue at the top to light blue at the horizon.

Obviously, simply mixing the two colors based on the Y coordinate of each fragment looks very bad: it is immediately obvious that you are looking at a box and not at a dome, as shown here:

[screenshot: wrong skybox]

Note the fairly visible "corners" of the box at the top left and top right of the window.

Instinctively, I thought the obvious solution would be to normalize the position of each fragment to get a point on the unit sphere, then take the Y coordinate of that point. I thought this would yield a value that is constant for a given "height", if that makes sense. Like this:

#version 330

in Data
{
    vec4 eyespace_position;
    vec4 eyespace_normal;
    vec4 worldspace_position;
    vec4 raw_position;
} vtx_data;

out vec4 outputColor;

const vec4 skytop = vec4(0.0f, 0.0f, 1.0f, 1.0f);
const vec4 skyhorizon = vec4(0.3294f, 0.92157f, 1.0f, 1.0f);

void main()
{
    vec4 pointOnSphere = normalize(vtx_data.worldspace_position);
    float a = pointOnSphere.y;
    outputColor = mix(skyhorizon, skytop, a);
}

The result, however, is almost identical to the first screenshot (I can post it if necessary, but since it looks so similar to the first one, I am leaving it out to keep this question short).

After some random attempts (cargo-cult programming, I know :/), I realized that this works:

void main()
{
    vec3 pointOnSphere = normalize(vtx_data.worldspace_position.xyz);
    float a = pointOnSphere.y;
    outputColor = mix(skyhorizon, skytop, a);
}

The only difference is that I normalize the position without the W component.
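One side note on this working version (a small hardening sketch, not something from the original shader): below the horizon, pointOnSphere.y goes negative, and GLSL's mix() extrapolates outside the [0, 1] range, so clamping keeps the horizon color for downward directions.

    void main()
    {
        vec3 pointOnSphere = normalize(vtx_data.worldspace_position.xyz);
        // Clamp so that directions below the horizon reuse the horizon color
        // instead of extrapolating past it.
        float a = clamp(pointOnSphere.y, 0.0, 1.0);
        outputColor = mix(skyhorizon, skytop, a);
    }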

And here is the working result (the difference is subtle in the screenshots, but quite noticeable in motion): [screenshot: correct skybox]

So finally, my question is: why does this work while the previous version fails? I must be misunderstanding something extremely basic about homogeneous coordinates, but my brain just won't click right now!

1 answer

GLSL's normalize does not handle homogeneous coordinates as such. It interprets the coordinate as a plain vector in R^4. That is not what you want at all. However, if vtx_data.worldspace_position.w == 0, the two normalizations would give the same result.
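To make that concrete, take the top-center vertex of the box, whose world-space position would be (0, 1, 0, 1) (assuming the model transform leaves w at 1.0):

    vec4 v4 = normalize(vec4(0.0, 1.0, 0.0, 1.0)); // (0.0, 0.7071, 0.0, 0.7071)
    vec3 v3 = normalize(vec3(0.0, 1.0, 0.0));      // (0.0, 1.0, 0.0)

Since w is a constant 1.0 while the length of xyz varies across the cube's faces, normalizing the vec4 is not a projection onto the unit sphere at all, which is why the cube's corners still show through in the gradient.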

I also don't see what vec3 pointOnSphere = normalize(vtx_data.worldspace_position); is supposed to mean, because the left-hand side would have to be of type vec4 as well.


Source: https://habr.com/ru/post/1403358/

