3 solutions to recover the view space position in perspective projection
The projection matrix describes the mapping from the 3D points of a scene to the 2D points of the viewport. It transforms from view (eye) space to clip space, and the coordinates in clip space are transformed to normalized device coordinates (NDC) by dividing by the w component of the clip coordinates. The NDC are in the range (-1, -1, -1) to (1, 1, 1).
In perspective projection, the projection matrix describes the mapping from 3D points in the world, as they are seen from a pinhole camera, to 2D points of the viewport.
The eye space coordinates in the camera frustum (a truncated pyramid) are mapped to a cube (the normalized device coordinates).
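The perspective divide described above can be sketched in a few lines of Python, with assumed example values for the clip space position:

```python
# Minimal sketch of the perspective divide: clip space -> NDC.
# The clip space position below is a hypothetical example value.
clip = (2.0, -1.0, 4.0, 4.0)               # clip space position (x, y, z, w)
ndc = tuple(c / clip[3] for c in clip[:3])  # divide xyz by the w component
print(ndc)                                  # each component lies in [-1, 1]
```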
Perspective projection matrix:
r = right, l = left, b = bottom, t = top, n = near, f = far

2*n/(r-l)      0              0               0
0              2*n/(t-b)      0               0
(r+l)/(r-l)    (t+b)/(t-b)    -(f+n)/(f-n)   -1
0              0              -2*f*n/(f-n)    0
For a symmetric perspective projection, it follows that:
aspect = w / h
tanFov = tan( fov_y * 0.5 );

prjMat[0][0] = 2*n/(r-l) = 1.0 / (tanFov * aspect)
prjMat[1][1] = 2*n/(t-b) = 1.0 / tanFov
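The relation above can be checked with a small Python sketch. The helper `perspective` and all parameter values are assumptions for illustration, not a fixed API:

```python
import math

def perspective(fov_y, aspect, n, f):
    """Symmetric perspective projection matrix (list of columns, as in
    GLSL's column-major layout). Hypothetical helper for illustration."""
    t = n * math.tan(fov_y * 0.5)   # top; for a symmetric frustum b = -t
    r = t * aspect                  # right; l = -r, so 2*n/(r-l) = n/r
    return [
        [n / r, 0.0,   0.0,                    0.0],
        [0.0,   n / t, 0.0,                    0.0],
        [0.0,   0.0,  -(f + n) / (f - n),     -1.0],
        [0.0,   0.0,  -2.0 * f * n / (f - n),  0.0],
    ]

# example parameters: 90 degree field of view, 16:9 aspect ratio
prjMat = perspective(math.radians(90.0), 16 / 9, 0.1, 100.0)
tanFov = math.tan(math.radians(90.0) * 0.5)   # == 1.0

# prjMat[0][0] == 1 / (tanFov * aspect), prjMat[1][1] == 1 / tanFov
print(prjMat[0][0], prjMat[1][1])
```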
In perspective projection, the Z component is calculated by a rational function:
z_ndc = ( -z_eye * (f+n)/(f-n) - 2*f*n/(f-n) ) / -z_eye
The depth ( gl_FragCoord.z and gl_FragDepth ) is calculated as follows:
z_ndc = clip_space_pos.z / clip_space_pos.w;
depth = (((farZ-nearZ) * z_ndc) + nearZ + farZ) / 2.0;
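These two formulas can be traced numerically; the Python sketch below uses assumed example values and the default depth range (nearZ = 0.0, farZ = 1.0):

```python
# Sketch of the depth computation above, with assumed example values.
n, f = 0.1, 100.0            # near and far plane of the projection
nearZ, farZ = 0.0, 1.0       # default depth range (glDepthRange)
z_eye = -10.0                # eye space z of a point in front of the camera

# rational function mapping eye space z to NDC z
z_ndc = (-z_eye * (f + n) / (f - n) - 2.0 * f * n / (f - n)) / -z_eye
depth = (((farZ - nearZ) * z_ndc) + nearZ + farZ) / 2.0
print(depth)   # non-linear: a point 10 units away is already close to 1
```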
1. Field of view and aspect ratio
Since the projection matrix is determined by the field of view and the aspect ratio, the view space position can be recovered with the field of view and the aspect ratio. This requires a symmetric perspective projection, and the normalized device coordinates, the depth, and the near and far planes have to be known.
Recover the Z distance in view space:
z_ndc = 2.0 * depth - 1.0;
z_eye = 2.0 * n * f / (f + n - z_ndc * (f - n));
Recover the view space position from the XY normalized device coordinates:
ndc_x, ndc_y = xy normalized device coordinates in range from (-1, -1) to (1, 1):

viewPos.x = z_eye * ndc_x * aspect * tanFov;
viewPos.y = z_eye * ndc_y * tanFov;
viewPos.z = -z_eye;
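Solution 1 can be sketched end to end in Python; the projection parameters, the depth value and the NDC coordinates below are assumed example inputs:

```python
import math

# Sketch of solution 1: recover the view space position from the depth
# and the XY normalized device coordinates (all values are assumptions).
fov_y  = math.radians(60.0)
aspect = 16 / 9
n, f   = 0.1, 100.0
tanFov = math.tan(fov_y * 0.5)

# hypothetical fragment: depth buffer value and NDC xy
depth        = 0.99
ndc_x, ndc_y = 0.25, -0.5

z_ndc = 2.0 * depth - 1.0
z_eye = 2.0 * n * f / (f + n - z_ndc * (f - n))

viewPos = (z_eye * ndc_x * aspect * tanFov,
           z_eye * ndc_y * tanFov,
           -z_eye)
print(viewPos)   # view space z is negative: the point lies in front of the camera
```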
2. Projection matrix
The projection parameters, defined by the field of view and the aspect ratio, are stored in the projection matrix. Therefore, the view space position can be recovered from the values of the projection matrix, for a symmetric perspective projection.
Note the relation between the projection matrix, the field of view and the aspect ratio:
prjMat[0][0] = 2*n/(r-l) = 1.0 / (tanFov * aspect);
prjMat[1][1] = 2*n/(t-b) = 1.0 / tanFov;
prjMat[2][2] = -(f+n)/(f-n)
prjMat[3][2] = -2*f*n/(f-n)
Recover the Z distance in view space:
A = prj_mat[2][2];
B = prj_mat[3][2];
z_ndc = 2.0 * depth - 1.0;
z_eye = B / (A + z_ndc);
Recover the view space position from the XY normalized device coordinates:
viewPos.x = z_eye * ndc_x / prjMat[0][0];
viewPos.y = z_eye * ndc_y / prjMat[1][1];
viewPos.z = -z_eye;
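Solution 2 can be verified with a round trip in Python: project a known view space point, then recover it using only the matrix entries. The matrix parameters and the test point are assumed example values:

```python
import math

# Sketch of solution 2: recover the view space position using only the
# entries of a symmetric perspective projection matrix (assumed values).
n, f   = 0.1, 100.0
fov_y  = math.radians(60.0)
aspect = 16 / 9
tanFov = math.tan(fov_y * 0.5)

# list of columns, matching GLSL's column-major indexing prjMat[col][row]
prjMat = [[1 / (tanFov * aspect), 0.0, 0.0, 0.0],
          [0.0, 1 / tanFov, 0.0, 0.0],
          [0.0, 0.0, -(f + n) / (f - n), -1.0],
          [0.0, 0.0, -2.0 * f * n / (f - n), 0.0]]

# forward: project a known (hypothetical) view space point
view   = (2.0, 1.0, -10.0)
clip_w = -view[2]
ndc_x  = prjMat[0][0] * view[0] / clip_w
ndc_y  = prjMat[1][1] * view[1] / clip_w
z_ndc  = (prjMat[2][2] * view[2] + prjMat[3][2]) / clip_w
depth  = z_ndc * 0.5 + 0.5

# backward: recover the position from depth, NDC and the matrix entries
A, B  = prjMat[2][2], prjMat[3][2]
z_eye = B / (A + (2.0 * depth - 1.0))
viewPos = (z_eye * ndc_x / prjMat[0][0],
           z_eye * ndc_y / prjMat[1][1],
           -z_eye)
print(viewPos)   # matches the original point `view`
```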
3. Inverse projection matrix
Of course, the view space position can also be recovered using the inverse projection matrix.
mat4 inversePrjMat = inverse( prjMat );
vec4 viewPosH = inversePrjMat * vec4( ndc_x, ndc_y, 2.0 * depth - 1.0, 1.0 );
vec3 viewPos = viewPosH.xyz / viewPosH.w;
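The same unprojection can be sketched with NumPy; the projection parameters and the NDC point below are assumptions, and the matrix is written row-major (the transpose of GLSL's column-major layout):

```python
import numpy as np

# Sketch of solution 3: unproject via the inverse projection matrix.
# Parameters are assumed: tan(fov_y/2) = 1 (i.e. fov_y = 90 deg), 16:9.
n, f, t_fov, aspect = 0.1, 100.0, 1.0, 16 / 9

# row-major perspective projection matrix (transpose of the GLSL layout)
prjMat = np.array([[1 / (t_fov * aspect), 0.0, 0.0, 0.0],
                   [0.0, 1 / t_fov, 0.0, 0.0],
                   [0.0, 0.0, -(f + n) / (f - n), -2.0 * f * n / (f - n)],
                   [0.0, 0.0, -1.0, 0.0]])

inversePrjMat = np.linalg.inv(prjMat)

# hypothetical fragment in NDC; z component is 2*depth - 1
ndc = np.array([0.2, -0.4, 0.96, 1.0])
viewPosH = inversePrjMat @ ndc
viewPos = viewPosH[:3] / viewPosH[3]     # perspective divide after unprojecting
print(viewPos)                           # view space position (z is negative)
```

Unlike solutions 1 and 2, this works for asymmetric frustums as well, at the cost of a full matrix inversion and a division per fragment.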
See also the answers to the following question: