Why is GLsizei not defined as unsigned?

I looked up the typedef of GLsizei while implementing OpenGL ES 1.1 on iOS and was surprised to find that it is defined as int. Some quick searching showed that this is normal (including in regular OpenGL).

I expected it to be defined as unsigned int or size_t. Why is it just a plain int?
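For reference, this is roughly how the relevant typedefs appear in a typical gl.h header; the exact contents vary by platform and version:

    /* Representative typedefs from an OpenGL / OpenGL ES header (gl.h).
       Exact definitions vary by platform and version. */
    typedef int          GLint;    /* signed 32-bit integer           */
    typedef int          GLsizei;  /* sizes and counts -- also signed */
    typedef unsigned int GLuint;   /* unsigned 32-bit integer         */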

1 answer

This is unlikely to be a problem unless you are working with data structures in the 4 GB range.

Someone answered this question here: http://oss.sgi.com/archives/ogl-sample/2005-07/msg00003.html

 Quote:

 (1) Arithmetic on unsigned values in C doesn't always yield intuitively correct results (e.g. width1-width2 is positive when width1<width2). Compilers offer varying degrees of diagnosis when unsigned ints appear to be misused. Making sizei a signed type eliminates many sources of semantic error and some irrelevant diagnostics from the compilers. (At the cost of reducing the range of sizei, of course, but for the places sizei is used that's rarely a problem.)

 (2) Some languages that support OpenGL bindings lack (lacked? not sure about present versions of Fortran) unsigned types, so by sticking to signed types as much as possible there would be fewer problems using OpenGL in those languages.
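Point (1) is easy to demonstrate in a few lines of C. This is only a sketch with made-up variable names; it shows how an unsigned subtraction wraps around instead of going negative, while the signed GLsizei version gives the intuitive result:

    #include <stdio.h>

    typedef int GLsizei;  /* as in gl.h */

    int main(void)
    {
        unsigned int uw1 = 100, uw2 = 200;
        GLsizei      sw1 = 100, sw2 = 200;

        /* Unsigned subtraction wraps modulo 2^32, so the "negative"
           difference comes out as a huge positive number. */
        printf("unsigned: %u\n", uw1 - uw2);   /* prints 4294967196 */

        /* Signed subtraction behaves as expected. */
        printf("signed:   %d\n", sw1 - sw2);   /* prints -100 */

        return 0;
    }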

Both explanations seem plausible. I have run into (1) several times myself, when I foolishly used NSUInteger as a loop counter (hint: don't do that, especially when counting down to zero); see the sketch below.
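The loop-counter mistake looks roughly like this (a minimal C sketch; NSUInteger is just an unsigned integer type, so a plain unsigned int stands in for it here):

    #include <stdio.h>

    int main(void)
    {
        /* BUG: i is unsigned, so "i >= 0" is always true. When i reaches 0
           and is decremented, it wraps around to UINT_MAX instead of going
           negative, and the loop would never terminate on its own. */
        for (unsigned int i = 9; i >= 0; --i) {
            printf("%u\n", i);
            if (i > 1000000) break;  /* safety net so this sketch stops */
        }

        /* With a signed counter (int / GLsizei) the countdown ends normally. */
        for (int j = 9; j >= 0; --j) {
            printf("%d\n", j);
        }

        return 0;
    }

Many compilers will also warn that the comparison "i >= 0" is always true for an unsigned type, which is exactly the kind of diagnostic the quoted message mentions.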
