Why are decimal and hexadecimal integer literals handled differently?

Reading Stanley Lippman's "C++ Primer", I learned that by default decimal integer literals are signed (the smallest of int , long , or long long in which the literal's value fits), while octal and hexadecimal literals can be either signed or unsigned (the smallest of int , unsigned int , long , unsigned long , long long , or unsigned long long in which the literal's value fits).

What is the reason for handling these literals in different ways?

Edit: to provide some context:

    int main() {
        auto dec = 4294967295;
        auto hex = 0xFFFFFFFF;
        return 0;
    }

Debugging this code in Visual Studio shows that the type of dec is unsigned long and the type of hex is unsigned int .
This is contrary to what I read, but still: both variables represent the same value, yet they have different types. That bothers me.
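A minimal sketch of asking the compiler itself to confirm the deduced types; the static_asserts below assume a C++11 compiler on an LP64 platform (32-bit int , 64-bit long ), where the standard's rules give long for the decimal literal and unsigned int for the hex one:

    #include <type_traits>

    int main() {
        auto dec = 4294967295;  // decimal literal: candidates are int, long, long long
        auto hex = 0xFFFFFFFF;  // hex literal: unsigned candidates are allowed too

        // On LP64 the value 4294967295 does not fit in a 32-bit int, so the
        // decimal literal takes the next signed type in its list: long.
        static_assert(std::is_same<decltype(dec), long>::value,
                      "decimal literal widens to the next signed type");

        // The same value written in hex fits in unsigned int, which appears
        // in the hex candidate list before long, so it stays 32 bits.
        static_assert(std::is_same<decltype(hex), unsigned int>::value,
                      "hex literal may deduce an unsigned type");
        return 0;
    }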

1 answer

C++ 2011 changed the rules for the types of integer literals relative to C++ 2003. The change is documented in §C.2.1 [diff.cpp03.lex]:

2.14.2
Change: Type of integer literals
Rationale: C99 compatibility

The C standard, both C 1999 and C 2011, defines the types of integer constants in section 6.4.4.1. (C++ 2011 §2.14.2 essentially copies the content from the C standard.)

The type of an integer constant is the first of the corresponding list in which its value can be represented.

The table in C §6.4.4.1 gives, for constants with no suffix:

    Decimal constant:              int, long int, long long int
    Octal or hexadecimal constant: int, unsigned int, long int,
                                   unsigned long int, long long int,
                                   unsigned long long int
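The suffixed rows of that table work the same way: a suffix changes the candidate list. A short sketch, assuming a platform with 32-bit int :

    #include <type_traits>

    int main() {
        // The U suffix switches the decimal candidate list to unsigned int,
        // unsigned long, unsigned long long; on a 32-bit-int platform the
        // value 4294967295 fits exactly in the first candidate.
        auto dec_u = 4294967295U;
        static_assert(std::is_same<decltype(dec_u), unsigned int>::value,
                      "U suffix selects an unsigned type");

        // LL keeps the literal signed but forces at least long long.
        auto dec_ll = 4294967295LL;
        static_assert(std::is_same<decltype(dec_ll), long long>::value,
                      "LL suffix forces long long");
        return 0;
    }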

The C99 Rationale gives the following explanation:

The C90 rule that the default type of a decimal integer constant is either int , long , or unsigned long , depending on which type is large enough to hold the value without overflow, simplifies the use of constants. The choices in C99 are int , long , and long long . C89 added the suffixes U and u to specify unsigned numbers. C99 adds LL to specify long long .

Unlike decimal constants, octal and hexadecimal constants too large to be int are typed as unsigned int if within range of that type, since it is more likely that they represent bit patterns or masks, which are generally best treated as unsigned, rather than "real" numbers.
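A minimal illustration of that distinction (a sketch assuming an LP64 platform, 32-bit int and 64-bit long ): the hex literal behaves like a 32-bit bit pattern, while the same value written in decimal participates in arithmetic as a wider signed number.

    #include <cstdint>
    #include <iostream>

    int main() {
        auto mask = 0xFFFFFFFF;   // unsigned int: a 32-bit bit pattern
        auto num  = 4294967295;   // long on LP64: a signed "real" number

        std::uint32_t reg = 0x12345678;
        std::cout << (reg & mask) << '\n';  // mask use: value unchanged

        std::cout << mask + 1 << '\n';  // 32-bit unsigned arithmetic wraps to 0
        std::cout << num + 1  << '\n';  // 4294967296: long is wide enough
        return 0;
    }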


Source: https://habr.com/ru/post/1246902/

