What prompted this question is my compiler's refusal to convert my pointer-to-unsigned-char into a pointer-to-signed-char. I was a bit embarrassed, because I've been using static_cast for my conversions for the longest time.
Then I dug into it a bit (well, it wasn't very deep, I just scratched the surface!), and although I now understand that refusing signedness-changing pointer conversions is exactly what makes static_cast a safer and better way to cast (compared with the traditional alternatives, which may invoke implementation-defined or undefined behavior), I'm still not sure what I should do in my situation.
I have a call to an OpenGL API function whose signature is:
void glShaderSource( GLuint shader, GLsizei count, const GLchar **string, const GLint *length );
I recently changed my file-reader API so that instead of returning the data read from the file as char *, it now returns unsigned char *. This change was no accident: I feel that unsigned char is a much better descriptor for raw data (even if that data happens to be ASCII text), and indeed void * might be even more honest in that regard.
And then, of course, I pass the address of this pointer as the third argument to glShaderSource.
I'd feel safe just doing a C-style cast from unsigned char ** to const GLchar **, and in fact that is probably the standard answer for this situation. Using reinterpret_cast would be a bit more explicit about it, but admittedly only a bit.
But I'd like to understand how one should reason about this situation. Why exactly can I ignore the signedness of the characters here? Is it just because I never expect to write a shader that has the high bit set in any of its characters, which means the signedness can't matter?
What should I do if I were facing a signed/unsigned integer situation where the dreaded consequence of negative integers being reinterpreted as huge positive values is real? How could I write code that tries to be "safe" about it?
My instinct tells me this is plainly impossible without writing code that actually goes and inspects the data itself rather than just passing the pointer along, so in this situation there is simply no way to get static_cast's safety back, since I have to work with pointers.