((UINT8_MAX + 1) * 1024) may become 0, because UINT8_MAX + 1 is usually 256, and 256 * 1024 is 0 modulo 2^16. So, if sizeof(int) == 2 on your architecture, the multiplication wraps around and you get 0. (Strictly speaking, signed integer overflow is undefined behavior, which is why it "may" become 0 rather than "will"; wrapping to 0 is just the typical outcome.)
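To see the wraparound even on a platform where int is wider, here is a minimal sketch that simulates 16-bit int arithmetic by truncating through uint16_t (the cast is mine, purely for illustration):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* 256 * 1024 = 262144, and 262144 % 65536 == 0, so the
         * low 16 bits of the product are all zero. */
        uint16_t wrapped = (uint16_t)((UINT8_MAX + 1) * 1024);
        printf("%u\n", (unsigned)wrapped);   /* prints 0 */
        return 0;
    }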
On typical modern desktop architectures with GCC, sizeof(int) == 4, so you won't get a division by 0.
To fix this, replace 1024 with 1024UL. This works because unsigned long is guaranteed to represent values up to at least 4294967295, so the multiplication cannot wrap. (Thanks to Pascal Cuoq for his explanation.)
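For reference, a minimal sketch of the fixed constant in use (the macro name CHUNK and the division are hypothetical, reconstructing the kind of expression from the question):

    #include <stdint.h>
    #include <stdio.h>

    /* 1024UL forces the multiplication into unsigned long, which
     * must hold at least 4294967295, so the result is 262144 even
     * where int is only 16 bits wide. */
    #define CHUNK ((UINT8_MAX + 1) * 1024UL)

    int main(void)
    {
        unsigned long total = 52428800UL;   /* e.g. 50 MiB */
        printf("%lu\n", total / CHUNK);     /* prints 200; no division by 0 */
        return 0;
    }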