I have some C code whose behavior puzzles me:
#include <stdio.h>

int main(void) {
    int a = 1;
    int b = 32;
    printf("%d\n%d\n", a << b, 1 << 32);
    return 0;
}
Output:

1
0
The code runs on Ubuntu 16.04 (Xenial Xerus), compiled with gcc -m32 a.c using GCC 5.4.0.
I have read several posts that explain why a<<b outputs 1, but I don't understand why 1<<32 results in 0. What is the difference between a<<b and 1<<32?