uint32_t b = 1 << 16;
as you noticed, the bit disappears unless you first make the 1 a 32-bit integer:
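Something like this, for instance (a minimal sketch; the cast is one way to do it, a suffix like 1UL would also work on most platforms):

    uint32_t b = (uint32_t)1 << 16;  /* widen the 1 first, then shift */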
A literal 1 has type int, your compiler's default integer type. I don't know which compiler you have, but an int is either 16 or 32 bits.
Now suppose it's 16 bits. When you shift a 1 left by 16 places, you just ... well, there is no bit 16 in a 16-bit int, so the 1 falls off the end (formally, undefined behavior). So make your 1 a 32-bit int first, and then shift it.
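For illustration, a self-contained sketch of both forms (variable names are mine; assumes a hosted C environment with <stdint.h> and <stdio.h>):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* If int is 16 bits, this shift happens in a 16-bit int, so the
           bit is shifted out before the assignment widens the result to
           32 bits (formally, undefined behavior). */
        uint32_t broken = 1 << 16;

        /* Widen first, then shift: the whole operation is 32-bit. */
        uint32_t fixed = (uint32_t)1 << 16;

        printf("%lu %lu\n", (unsigned long)broken, (unsigned long)fixed);
        return 0;
    }

On a compiler with a 32-bit int both print 65536; on a 16-bit-int target only the second is guaranteed to.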