Int to char casting

 int i = 259;       /* bytes in memory: 03 01 00 00 in little endian; 00 00 01 03 in big endian */
 char c = (char)i;  /* yields 03 on both little and big endian?? */

On my computer c is assigned 03, and my machine is little endian, but I don't know whether the cast to char reads the least significant byte or reads the byte at the address of the variable i.

+6
4 answers

Endianness doesn't change anything here. The conversion operates on the value of i, not on its stored bytes, so it does not pick out one of them (MSB, LSB, etc.).

  • If char is unsigned, the value wraps around. Assuming an 8-bit char, 259 % 256 = 3 (see the sketch after this list).
  • If char is signed, the result is implementation-defined. Thanks pmg: 6.3.1.3/3 in the C99 standard.
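
A minimal sketch of that value-based conversion, assuming CHAR_BIT is 8 and a common two's-complement implementation (the signed result is implementation-defined, so the 3 shown for signed char is only what typical compilers produce):

 #include <stdio.h>

 int main(void)
 {
     int i = 259;                          /* 0x103 */
     unsigned char uc = (unsigned char)i;  /* wraps: 259 % 256 = 3, guaranteed by the standard */
     signed char   sc = (signed char)i;    /* implementation-defined; usually 3 on two's-complement machines */
     printf("unsigned: %u, signed: %d\n", (unsigned)uc, (int)sc);
     return 0;
 }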
+11

Since you cast from a larger integer type to a smaller one, the conversion keeps the least significant part, regardless of endianness. If you instead cast the address (a pointer cast), you would get the byte at the lowest address, and that does depend on endianness.

So c = (char)i assigns the least significant byte of i to c, but c = *((char *)(&i)) assigns the byte at the lowest address of i to c, which is the same thing only on little-endian systems.
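
A small sketch contrasting the two, assuming a 4-byte int holding 259 (0x00000103); the pointer version prints 0x03 on a little-endian machine and 0x00 on a big-endian one:

 #include <stdio.h>

 int main(void)
 {
     int i = 259;                      /* 0x00000103 */
     char by_value   = (char)i;        /* conversion: always the least significant byte, 0x03 */
     char by_address = *((char *)&i);  /* reinterpretation: the byte at the lowest address */
     printf("by value: 0x%02x, by address: 0x%02x\n",
            (unsigned char)by_value, (unsigned char)by_address);
     return 0;
 }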

+6

If you want to test for little/big endian at run time, you can use a union:

 int isBigEndian(void)
 {
     union foo {
         size_t i;
         char cp[sizeof(size_t)];
     } u;
     u.i = 1;
     return *u.cp != 1;  /* first byte is 1 on little endian, 0 on big endian */
 }

This works because on a little-endian machine the bytes of u.i look like 01 00 ... 00, while on a big-endian machine they are 00 ... 00 01 (the ... is all zeros). So if the first byte is 0, the test returns true; otherwise it returns false. Beware, however, that there are also mixed-endian machines that store data in yet other orders (some can switch their endianness, others simply use an unusual byte order). The PDP-11, for example, stored a 32-bit int as two 16-bit words, but with the word order reversed (e.g. 0x01234567 was stored as 4567 0123).
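
A usage sketch of that union test (the function body is repeated here only so the snippet compiles on its own):

 #include <stdio.h>

 int isBigEndian(void)  /* same union-based check as above */
 {
     union { size_t i; char cp[sizeof(size_t)]; } u;
     u.i = 1;
     return *u.cp != 1;
 }

 int main(void)
 {
     printf("this machine is %s endian\n", isBigEndian() ? "big" : "little");
     return 0;
 }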

0

When casting from int (here 4 bytes) to char (1 byte), only the least significant byte is kept. For instance:

 int x = 0x3F1;                       // 0x3F1 = 0000 0011 1111 0001
 char y = (char)x;                    // 1111 0001 --> -15 in decimal (two's complement, signed char)
 unsigned char z = (unsigned char)x;  // 1111 0001 --> 241 in decimal
-1

Source: https://habr.com/ru/post/897773/

