Problem with char signedness in C++

This is the code:

#include <iostream>
using namespace std;

int main()
{
    char c = 0x00;
    //c |= 0x0A;
    c |= 0xA0;

    while (c != 0x00)
    {
        cout << (c & 1) << endl;
        c = (c >> 1);
    }
}

Why does this code work when I OR with 0x0A but not with 0xA0? Is it because 0xA0 is too large to fit in a signed char? But why am I not allowed to set the bits of 0xA0?

When I run the loop with 0xA0, it never breaks and just keeps printing 1. Why?

2 answers

This is due to sign extension when shifting right.

0xA0 <=>        10100000 binary (a negative number, because the MSB is set to 1)
(0xA0 >> 1) <=> 11010000 (the sign bit is copied into the vacated top bit)
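A minimal sketch that makes the extension visible (assuming an 8-bit char that is signed, and an arithmetic right shift for negative values, which is what mainstream compilers do):

#include <iostream>

int main()
{
    char c = (char)0xA0;               // bit pattern 1010 0000; -96 when char is signed
    char s = c >> 1;                   // arithmetic shift copies the sign bit in
    std::cout << (int)c << '\n';       // -96
    std::cout << (int)s << '\n';       // -48, bit pattern 1101 0000 -- still negative
    std::cout << (0xA0 >> 1) << '\n';  // 80: the plain int 0xA0 has no sign bit set
}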

Replace char c with unsigned char c, or mask off the MSB after the shift:

#include <iostream>
using namespace std;

int main()
{
    char c = 0x00;
    //c |= 0x0A;
    c |= 0xA0;

    while (c != 0x00)
    {
        cout << (c & 1) << endl;
        c = (c >> 1) & 0x7f;  // clear the MSB so sign extension cannot keep it set
    }
}
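Alternatively, a sketch of the unsigned char fix; with no sign bit, zeros are shifted in from the left and the loop ends after eight iterations:

#include <iostream>
using namespace std;

int main()
{
    unsigned char c = 0x00;
    c |= 0xA0;                // 1010 0000, value 160 -- no sign bit to extend

    while (c != 0x00)
    {
        cout << (c & 1) << endl;
        c = (c >> 1);         // zeros shift in from the left, so c reaches 0
    }
}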

Note that char is usually signed (range -128..127), but some C/C++ compilers define char as unsigned (the AIX xlC compiler is a known case).
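If you want to check what your compiler does, a small sketch using the standard library:

#include <iostream>
#include <limits>
#include <climits>

int main()
{
    std::cout << "char is "
              << (std::numeric_limits<char>::is_signed ? "signed" : "unsigned")
              << '\n';
    std::cout << "CHAR_MIN = " << CHAR_MIN << '\n';   // 0 when char is unsigned
}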


char is signed on your platform, and 0xA0 has the most significant bit set, so it is stored as a negative value (the same sign-extension issue described in the other answer).

0xA0 = 1010 0000 binary

On the first pass the low bit is 0, so c & 1 prints 0. Then...

c = (c >> 1)

...sign-extends, producing 1101 0000, 1110 1000, 1111 0100, 1111 1010, 1111 1101, 1111 1110, 1111 1111 - and from then on 1111 1111 forever.

"When I run the loop, it never breaks and it only prints 1s. Why?"

Because c never reaches 0: once it becomes all 1s, shifting right leaves it unchanged, so c & 1 prints 1s forever.
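A bounded sketch that traces the loop (assuming an 8-bit signed char) and shows c getting stuck at all 1s:

#include <iostream>
#include <bitset>

int main()
{
    char c = (char)0xA0;
    for (int i = 0; i < 10; ++i)   // bounded, because the real loop never terminates
    {
        std::cout << std::bitset<8>(c) << "  prints " << (c & 1) << '\n';
        c = c >> 1;
    }
}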


Source: https://habr.com/ru/post/1786330/

