All ISO C standards measure object sizes in multiples of the size of char, which means that, by definition, sizeof(char) == 1 . The number of bits in a char is given by the CHAR_BIT macro in <limits.h> (or <climits> in C++), and it must be at least 8.
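As a minimal sketch (assuming a hosted implementation), you can print these values for your own platform:

```c
/* Sketch: print CHAR_BIT and sizeof(char) for the current implementation.
   CHAR_BIT is 8 on virtually all modern platforms; sizeof(char) is 1 by
   definition on every conforming implementation. */
#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("CHAR_BIT     = %d\n", CHAR_BIT);
    printf("sizeof(char) = %zu\n", sizeof(char));
    return 0;
}
```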
Additional type restrictions:
sizeof(char) <= sizeof(short int) <= sizeof(int) <= sizeof(long int)
int must be able to represent at least the range -32767 to +32767, so it must be at least 16 bits wide.
C99 added long long int , whose size is greater than or equal to that of long int .
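These guarantees show up directly in the <limits.h> constants. A small sketch (again assuming a hosted implementation) that prints the actual ranges your compiler provides, which must be at least the standard minimums:

```c
/* Sketch: print the actual limits of the standard integer types.
   The standard only guarantees minimum magnitudes (e.g. INT_MAX >= 32767,
   LONG_MAX >= 2147483647, LLONG_MAX >= 9223372036854775807); the exact
   values are chosen by the implementation. */
#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("SHRT_MAX  = %d\n",   SHRT_MAX);
    printf("INT_MAX   = %d\n",   INT_MAX);
    printf("LONG_MAX  = %ld\n",  LONG_MAX);
    printf("LLONG_MAX = %lld\n", LLONG_MAX);
    return 0;
}
```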
Everything else is implementation-defined: the C implementation (compiler and ABI) decides exactly how large these types are.
How do C compilers choose these sizes?
There are some general conventions that most compilers adhere to. long is often chosen to be as wide as a machine word. On 64-bit machines, where CHAR_BIT == 8 (this is almost always the case, so it is assumed for the rest of this answer), this means sizeof(long) == 8 . On 32-bit machines, it means sizeof(long) == 4 .
int is almost always 32 bits wide.
long long int often has a width of 64 bits.
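A quick way to see what your own compiler chose (a sketch; the sizes in the comments assume common data models and are not guaranteed):

```c
/* Sketch: print the sizes the implementation chose for each type.
   On a typical LP64 system (64-bit Linux/macOS) this prints 2, 4, 8, 8;
   on 64-bit Windows (LLP64), long is 4 bytes instead of 8. */
#include <stdio.h>

int main(void)
{
    printf("sizeof(short)     = %zu\n", sizeof(short));
    printf("sizeof(int)       = %zu\n", sizeof(int));
    printf("sizeof(long)      = %zu\n", sizeof(long));
    printf("sizeof(long long) = %zu\n", sizeof(long long));
    return 0;
}
```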