OpenGL draws in 16 colors after changing RGBA color

When I try to draw a 2D circle in OpenGL with an RGBA color, it is drawn with the closest color from the 16-color palette. Here is the code I'm using:

    // Init canvas
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrtho(0, Screen.Width(), Screen.Height(), 0, 0, 1);
    glMatrixMode(GL_MODELVIEW);
    glColorMaterial(GL_FRONT, GL_AMBIENT_AND_DIFFUSE);
    glEnable(GL_COLOR_MATERIAL);

    // Background
    glClearColor(0.0, 0.0, 0.0, 1.0);
    glShadeModel(GL_SMOOTH);
    glClear(GL_COLOR_BUFFER_BIT);

    [...]

    glColor3f(Color.R, Color.G, Color.B);
    glBegin(GL_TRIANGLE_FAN);
    glVertex2f(Pos.X - SX, Pos.Y - SY);
    for (int angle = 0; angle <= 360; angle += 1)
        glVertex2f(Pos.X - SX + sin(angle * M_PI / 180.0) * Size,
                   Pos.Y - SY + cos(angle * M_PI / 180.0) * Size);
    glEnd();

    [...]

    // Render
    glFlush();
    glDisable(GL_COLOR_MATERIAL);

Color is an instance of the following Color structure:

    struct Color
    {
        float R;
        float G;
        float B;
        float A;

        void operator =(Color Clr);
        bool operator ==(Color Clr);
    };

The following code is used to configure the engine:

    // Create context
    GDC = GetDC(Handle);

    // Create pixel format descriptor
    PIXELFORMATDESCRIPTOR GPFD =
    {
        sizeof(PIXELFORMATDESCRIPTOR),
        1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA,
        24,
        0, 0, 0, 0, 0, 0,
        0, 0,
        0, 0, 0, 0, 0,
        32,
        0,
        0,
        PFD_MAIN_PLANE,
        0,
        0, 0, 0
    };
    GPixelFormat = ChoosePixelFormat(GDC, &GPFD);
    SetPixelFormat(GDC, GPixelFormat, &GPFD);

    // Create resource
    GRC = wglCreateContext(GDC);
    wglMakeCurrent(GDC, GRC);

    // Setup resource
    glClearColor(1.0f, 1.0f, 1.0f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glFlush();

What am I doing wrong?

UPDATE:

When I debug this code and read the pixel format back with DescribePixelFormat(), the debugger shows the following: https://dl.dropboxusercontent.com/u/12669217/Debugger.jpg In addition, PFD_GENERIC_FORMAT and PFD_NEED_PALETTE are not set.

This is the desired result (before I used OpenGL): https://dl.dropboxusercontent.com/u/12669217/CR_Desired.png

This is the actual output (text and background not yet implemented): https://dl.dropboxusercontent.com/u/12669217/CR_Actual.png

1 answer

In WGL, the pixel format type PFD_TYPE_RGBA gives you a pixel format that uses bitplanes. In RGBA mode there are no color palettes; however, the format you end up with depends on the number of bits you assign to each plane and any shift you apply.

I would suggest using:

    PIXELFORMATDESCRIPTOR GPFD =
    {
        sizeof(PIXELFORMATDESCRIPTOR),
        1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA,
        24,
        0, 0, 0, 0, 0, 0,  // No shift bits or arbitrary bitplane allocation
        0, 0,
        0, 0, 0, 0, 0,
        32,
        0,
        0,
        PFD_MAIN_PLANE,
        0,
        0, 0, 0
    };

I suspect the shift-bit fields are responsible for your unusual behavior. If you set these fields in unusual ways, you can be handed a pixel format that is not hardware accelerated. For example, a 32-bit z-buffer is unsupported on a lot of hardware (24-bit Z + 8-bit stencil is far more compatible).
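If the 32-bit depth buffer turns out to be the problem, a descriptor requesting the more compatible 24-bit depth + 8-bit stencil combination would look like this (a sketch of the same initializer; only the cDepthBits and cStencilBits fields differ, per the PIXELFORMATDESCRIPTOR field order):

    PIXELFORMATDESCRIPTOR GPFD =
    {
        sizeof(PIXELFORMATDESCRIPTOR),
        1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA,
        24,                // cColorBits
        0, 0, 0, 0, 0, 0,  // R/G/B bits and shifts: let WGL decide
        0, 0,              // cAlphaBits, cAlphaShift
        0, 0, 0, 0, 0,     // accumulation buffer
        24,                // cDepthBits   (was 32)
        8,                 // cStencilBits (was 0)
        0,                 // cAuxBuffers
        PFD_MAIN_PLANE,
        0,
        0, 0, 0
    };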

What you REALLY need to do is call DescribePixelFormat (...) after choosing the pixel format and see what WGL actually gave you. WGL searches for the pixel format that most closely matches the one you requested.


Take a look at the ChoosePixelFormat reference on MSDN. It states the following:

Remarks

You must ensure that the pixel format matched by the ChoosePixelFormat function satisfies your requirements. For example, if you request a pixel format with a 24-bit RGB color buffer but the device context offers only 8-bit RGB color buffers, the function returns an 8-bit RGB pixel format.
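To act on that, compare what you asked for with what you actually got. The helper below is a hypothetical sketch that uses plain integers (mirroring the cColorBits / cDepthBits fields) so the comparison logic is visible without the WinAPI headers:

```cpp
// Hypothetical helper: decide whether the format WGL matched is good
// enough. ChoosePixelFormat() returns the *closest* match, not an exact
// one, so the returned bit depths can be lower than what was requested.
bool FormatIsAcceptable(int gotColorBits, int gotDepthBits,
                        int wantColorBits, int wantDepthBits)
{
    return gotColorBits >= wantColorBits && gotDepthBits >= wantDepthBits;
}
```

On Windows you would call this with GPFD.cColorBits and GPFD.cDepthBits after DescribePixelFormat() has filled the structure in, and fall back to a less demanding format if it returns false.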


UPDATE:

Add this code to your setup and examine the GPFD data in the debugger:

    DescribePixelFormat(GDC, GPixelFormat, sizeof(PIXELFORMATDESCRIPTOR), &GPFD);

Pay particular attention to fields like cColorBits and cDepthBits.

There are also a couple of flags you should check, but they are not easy to read in the debugger. In your code, check:

  • GPFD.dwFlags & PFD_GENERIC_FORMAT
  • GPFD.dwFlags & PFD_NEED_PALETTE
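As a sketch of that check, the classifier below interprets the dwFlags field. The flag values are copied from <wingdi.h> so the logic compiles anywhere; on Windows you would simply include <windows.h> and test GPFD.dwFlags directly:

```cpp
// Flag values mirrored from <wingdi.h>.
const unsigned long PFD_GENERIC_FORMAT      = 0x00000040;
const unsigned long PFD_NEED_PALETTE        = 0x00000080;
const unsigned long PFD_GENERIC_ACCELERATED = 0x00001000;

// Classify the dwFlags filled in by DescribePixelFormat():
//   0 = fully hardware accelerated (ICD)
//   1 = generic format, partially accelerated (MCD)
//   2 = generic GDI software renderer, unaccelerated
//   3 = palettized mode -- this would explain 16-color output
int ClassifyPixelFormat(unsigned long dwFlags)
{
    if (dwFlags & PFD_NEED_PALETTE)
        return 3;
    if (!(dwFlags & PFD_GENERIC_FORMAT))
        return 0;
    return (dwFlags & PFD_GENERIC_ACCELERATED) ? 1 : 2;
}
```

Since your update says neither PFD_GENERIC_FORMAT nor PFD_NEED_PALETTE is set, this classifier would report a fully accelerated ICD format, which narrows the search elsewhere.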

Source: https://habr.com/ru/post/1499238/
