Colors in vertex buffer objects - DirectX vs OpenGL

I am working on a small game engine. One of its requirements is that it must support both DirectX and OpenGL rendering.

I use vertex buffer objects together with a structure that defines the format of my vertices. The problem is that I would like to use the same structure for both DirectX and OpenGL, so that I can switch from my DirectX rendering component to OpenGL without changing the vertices of my objects.

Is it possible?

I am currently using the following structure for DirectX:

struct Vertex {
    float position[3]; // x, y, z
    float normal[3];   // nx, ny, nz
    DWORD colour;      // the vertex colour, packed
    float texture[2];  // u, v
};

together with:

 #define D3DFVF_CUSTOMVERTEX (D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_DIFFUSE | D3DFVF_TEX1 ) 

as a flexible vertex format.

My understanding of OpenGL is that when I want to draw my object, I can specify the colour with this call:

 glColorPointer(4, GL_FLOAT, sizeof(Vertex), BUFFER_OFFSET(24)); 

in the drawing procedure, assuming the colour has four components, and this does indeed work. However, I don't believe I can tell OpenGL to read the colour as a single unsigned integer, so instead I use:

 struct Vertex {
    float position[3]; // x, y, z
    float normal[3];   // nx, ny, nz
    float colour[4];   // r, g, b, a
    float texture[2];  // u, v
};

This is the structure I use in my OpenGL code.

2 answers

What you need is the ARB_vertex_array_bgra extension. It was designed specifically for D3D interoperability.

So your glColorPointer call will look like this:

 glColorPointer(GL_BGRA, GL_UNSIGNED_BYTE, sizeof(Vertex), BUFFER_OFFSET(24)); 

A DWORD is a 32-bit unsigned integer. As far as I know, DirectX treats it as four 8-bit fields, one per colour channel.

You can do the same in OpenGL by telling it to read unsigned bytes instead of floats:

 glColorPointer(4, GL_UNSIGNED_BYTE, sizeof(Vertex), BUFFER_OFFSET(24)); 

You will need to change the OpenGL structure to

 struct Vertex {
    float position[3];   // x, y, z
    float normal[3];     // nx, ny, nz
    unsigned int colour; // packed colour, same size as a DWORD
    float texture[2];    // u, v
};
