I am using the SharpDX Toolkit and I am trying to create a Texture2D programmatically, so I can manually specify all the pixel values, but I am not sure which pixel format to create it with.
SharpDX doesn't even document the Toolkit's PixelFormat class (they have documentation for another PixelFormat class, but it's for WIC, not the Toolkit). I found the DirectX enum that it wraps, DXGI_FORMAT, but its documentation gives no useful guidance on how to choose a format.
I'm used to the old 32-bit raster image formats with 8 bits per color channel plus 8 bits of alpha, which is good enough for me, so I assume the simplest choice would be R8G8B8A8 or B8G8R8A8. Does it matter which one I pick? Will both be fully supported on all hardware?
And even after I pick one of those, I also have to specify whether it is SInt, SNorm, Typeless, UInt, UNorm, or UNormSRgb. I do not need the sRGB color space, and I don't understand what Typeless would be for. UInt seems the simplest - just a plain old unsigned byte - but it turns out it doesn't work: I get no error, but my texture doesn't draw anything to the screen. UNorm works, but nothing in the documentation explains why UInt doesn't. So now I'm paranoid that UNorm might not work on some other graphics card.
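If it helps, I suppose I could at least query support at runtime with something like this (a rough sketch; I'm assuming the Toolkit's GraphicsDevice can be cast to the underlying SharpDX.Direct3D11.Device, and the helper name is just mine):

```csharp
using SharpDX.DXGI;
using D3D11 = SharpDX.Direct3D11;

// Sketch: ask the D3D11 device whether a DXGI format can be used as a
// sampled 2D texture. CheckFormatSupport wraps ID3D11Device::CheckFormatSupport.
static bool CanSampleAsTexture2D(D3D11.Device device, Format format)
{
    D3D11.FormatSupport support = device.CheckFormatSupport(format);
    return (support & D3D11.FormatSupport.Texture2D) != 0
        && (support & D3D11.FormatSupport.ShaderSample) != 0;
}

// Hypothetical usage; the cast from the Toolkit GraphicsDevice is assumed:
// bool ok = CanSampleAsTexture2D((D3D11.Device)GraphicsDevice, Format.R8G8B8A8_UNorm);
```

But that only tells me about the machine I'm running on, not about everyone else's hardware.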
Here is the code I have, if anyone wants to see it. Download the full SharpDX package, open the SharpDXToolkitSamples project, go to the SpriteBatchAndFont.WinRTXaml project, open the SpriteBatchAndFontGame class, and add the code where indicated below.
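The relevant additions look roughly like this (a minimal sketch; I'm assuming the Toolkit's Texture2D.New and SetData overloads shown here, and the 64x64 size and diagonal pattern are just what I happened to pick):

```csharp
// Assumed usings, already present in the sample:
// using SharpDX;
// using SharpDX.Toolkit;
// using SharpDX.Toolkit.Graphics;

private Texture2D texture;   // added field

// --- added to LoadContent ---
// PixelFormat.R8G8B8A8.UNorm is the Toolkit wrapper around DXGI_FORMAT_R8G8B8A8_UNORM.
texture = Texture2D.New(GraphicsDevice, 64, 64, PixelFormat.R8G8B8A8.UNorm);

var pixels = new Color[64 * 64];
for (int y = 0; y < 64; y++)
{
    for (int x = 0; x < 64; x++)
    {
        // White pixels along the diagonals, transparent elsewhere.
        pixels[y * 64 + x] = ((x + y) % 8 == 0) ? Color.White : Color.Transparent;
    }
}
texture.SetData(pixels);

// --- added to Draw, between spriteBatch.Begin() and spriteBatch.End() ---
spriteBatch.Draw(texture, Vector2.Zero, Color.White);
```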
This draws a small rectangle with diagonal lines in the upper-left corner of the screen. It works on the laptop I'm testing on, but I have no idea how to tell whether that means it will work everywhere, and I don't know whether it is the most efficient choice.
What pixel format should I use to make sure that my application will run on all hardware and get maximum performance?