I am trying to use the DepthBias property of the rasterizer state in DirectX 11 (D3D11_RASTERIZER_DESC) to fix the z-fighting that occurs when rendering in wireframe mode on top of solid polygons (overlaying), but setting it to any value seems to change nothing in the result. I also noticed something strange... the value is defined as an INT, not a FLOAT. That doesn't make sense to me, and either way it still doesn't work properly. How am I supposed to set this value correctly if it is an INT, when it ends up being interpreted as UNORM in the shader pipeline?
Here's what I'm doing (a rough code sketch of this loop follows the list):
- Render all the geometry with the normal solid-fill rasterizer state
- Switch to a rasterizer state that renders in wireframe
- Render all the geometry again
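For reference, the draw loop looks roughly like this (a minimal sketch; g_context, g_solidRS, g_wireframeRS and indexCount are placeholder names for my actual objects):

// Pass 1: draw the geometry filled, with the solid rasterizer state
g_context->RSSetState(g_solidRS);
g_context->DrawIndexed(indexCount, 0, 0);

// Pass 2: draw the same geometry again with the wireframe rasterizer state
g_context->RSSetState(g_wireframeRS);
g_context->DrawIndexed(indexCount, 0, 0);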
I can clearly see the wireframe overlay, but the z-fighting is terrible. I have tried setting DepthBias to lots of different values, such as 0.000001, 0.1, 1, 10, 1000 and all their negative equivalents, and still no result... Obviously I know that when you cast a float to an int, all the decimals are truncated... meh?
D3D11_RASTERIZER_DESC RasterizerDesc;
ZeroMemory(&RasterizerDesc, sizeof(RasterizerDesc));
RasterizerDesc.FillMode = D3D11_FILL_WIREFRAME;
RasterizerDesc.CullMode = D3D11_CULL_BACK;
RasterizerDesc.FrontCounterClockwise = FALSE;
RasterizerDesc.DepthBias = ???;
RasterizerDesc.SlopeScaledDepthBias = 0.0f;
RasterizerDesc.DepthBiasClamp = 0.0f;
RasterizerDesc.DepthClipEnable = TRUE;
RasterizerDesc.ScissorEnable = FALSE;
RasterizerDesc.MultisampleEnable = FALSE;
RasterizerDesc.AntialiasedLineEnable = FALSE;
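The state itself is created and bound like this (again just a sketch; g_device, g_context and the error handling are simplified placeholders):

ID3D11RasterizerState* wireframeRS = nullptr;
HRESULT hr = g_device->CreateRasterizerState(&RasterizerDesc, &wireframeRS);
if (FAILED(hr))
    return; // creation failed

// Bound before the second (wireframe) pass
g_context->RSSetState(wireframeRS);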
How are you supposed to set DepthBias? Is this a bug in DirectX (which I doubt), or is there maybe a better way to achieve this than using DepthBias?
Thanks!