#include <stdio.h>

int main(void) {
    unsigned i = -23;                                     // line 1
    short a, b = 0x1;
    printf("sizeof(short) = %i\n", (int)sizeof(short));  // line 2
    a = b << 31;                                          // line 3
    printf("i = %i", i);                                  // line 4
    printf("i = %u", i);                                  // line 5
    return 0;
}
Why doesn't line 1 give any errors, even though a negative value is assigned to an unsigned type?
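For reference, here is a stripped-down version of just that assignment. It compiles cleanly for me; assuming unsigned int is 32 bits on this machine, I expect i to hold the wrapped-around value:

#include <stdio.h>
#include <limits.h>

int main(void) {
    unsigned i = -23;   // compiles without any diagnostic by default
    // if unsigned int is 32 bits, this should print 4294967273, i.e. UINT_MAX - 22
    printf("i = %u (UINT_MAX = %u)\n", i, UINT_MAX);
    return 0;
}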
Line 2 prints sizeof(short) as 2 bytes on my system, so a and b are short integers that are 2 bytes (16 bits) long. Yet line 3 does not cause any errors. How is a left shift of 31 bits possible when the word length is only 16 bits?
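To try to narrow this down, I also checked which type the shift is actually performed in. My assumption is that applying sizeof to the shift expression reveals whether the operands are promoted before shifting (sizeof does not evaluate its operand, so the shift itself never runs here):

#include <stdio.h>

int main(void) {
    short b = 0x1;
    // only the type of the expression b << 31 is inspected, not its value
    printf("sizeof(b)       = %zu\n", sizeof b);
    printf("sizeof(b << 31) = %zu\n", sizeof(b << 31));  // prints 4 here, not 2, which I take
                                                         // to mean b is promoted to int
    return 0;
}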
Is there any implicit conversion on lines 4 and 5?
I am using the GCC compiler on a 64-bit Mac.