A friend of mine was looking at this open-source SSL code, in the SHA hash processing functions, and noticed this odd piece:
ctx->total[0] += (uint32_t) ilen;   // ilen is of type size_t
ctx->total[0] &= 0xFFFFFFFF;

if( ctx->total[0] < (uint32_t) ilen )
    ctx->total[1]++;
There are two things about this code we cannot understand. Firstly, it ANDs ctx->total[0] (which is a uint32_t) with 0xFFFFFFFF, which should do nothing: in binary this is an AND with all 1s, which should give back the same value. In my opinion, these two lines are equivalent:
ctx->total[0] &= 0xFFFFFFFF;
ctx->total[0] = ctx->total[0];
If I am right, why is this line there? Is it there for some security reason? If I am wrong, how and why?
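To show why we think the mask does nothing, here is a tiny stand-alone test we wrote (it simply assumes the counter really is a uint32_t, as the header seems to say):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* Assuming the counter really is a uint32_t, as in the header. */
    uint32_t total0 = 0xDEADBEEF;

    uint32_t masked = total0 & 0xFFFFFFFF;  /* AND with all 32 bits set */

    /* Prints "same": for a 32-bit value the mask changes nothing. */
    printf("%s\n", masked == total0 ? "same" : "different");
    return 0;
}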
Secondly, we do not understand when this if can ever be true, assuming the AND does nothing. If the AND does nothing, then the if is essentially:
if (ctx->total[0] < ctx->total[0])
which should never be true. What are we missing?
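In case it helps, here is a minimal stand-alone version of the snippet that anyone can compile and experiment with (the struct is cut down to just the two counters; the real context obviously has more fields):

#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Cut-down context: just the two counters, not the real SHA state. */
typedef struct { uint32_t total[2]; } ctx_t;

/* The exact statements from the question, wrapped in a function. */
static void update_total(ctx_t *ctx, size_t ilen)
{
    ctx->total[0] += (uint32_t) ilen;
    ctx->total[0] &= 0xFFFFFFFF;

    if( ctx->total[0] < (uint32_t) ilen )
        ctx->total[1]++;
}

int main(void)
{
    ctx_t ctx;
    memset(&ctx, 0, sizeof ctx);

    update_total(&ctx, 100);  /* pretend we hashed 100 bytes */
    printf("total[1]=%u total[0]=%u\n",
           (unsigned) ctx.total[1], (unsigned) ctx.total[0]);
    return 0;
}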
If you want to see the header file, to check that ctx->total[0] really is of type uint32_t or for any other reason, you can find it here.
Also, my first wild guess is that something is hiding in the narrowing of ilen from size_t down to uint32_t, but I'm still stuck and confused.
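To make that guess concrete, this is what I mean by the drop from size_t to uint32_t (assuming a 64-bit platform where size_t is 64 bits wide; the length value is made up purely for illustration):

#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* Hypothetical length larger than 4 GiB; only representable
     * where size_t is 64 bits wide. */
    size_t ilen = 0x100000005;       /* 4 GiB + 5 bytes */

    uint32_t low = (uint32_t) ilen;  /* the cast keeps only the low 32 bits */

    /* Prints: ilen = 4294967301, (uint32_t) ilen = 5 */
    printf("ilen = %zu, (uint32_t) ilen = %u\n", ilen, low);
    return 0;
}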