Why do both UTF-16LE and UTF-16BE exist? Endianness / efficiency - C

I was wondering why UTF-16LE and UTF-16BE both exist. Is it considered "inefficient" for a big-endian environment to process little-endian data?

This is currently what I use when saving a 2-byte variable locally:

  unsigned char octets[2];
  short int shortint = 12345; /* (assuming short int = 2 bytes) */
  octets[0] = shortint & 255;        /* low byte first */
  octets[1] = (shortint >> 8) & 255; /* high byte second */

I know that when the data is stored and read locally with one fixed byte order, there is no endianness risk. I was wondering whether this approach is considered "inefficient"? What would be the most "efficient" way to store a 2-byte variable? (Assuming the data is restricted to the local environment, local use only.)
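For reference, the matching read-back is just the reverse of the store (a minimal sketch, assuming the same low-byte-first layout as above):

  unsigned char octets[2]; /* filled by the code above */
  short int shortint;

  /* Reassemble: octets[0] is the low byte, octets[1] the high byte. */
  shortint = (short int)(octets[0] | (octets[1] << 8));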

Thanks Doori Bar

1 answer

Unicode text is processed in 16-bit units, and CPUs differ in their native byte order. If the standard had fixed a single byte order, then perhaps 66% of machines would have to byte-swap every unit they read, while the other 33% could read it directly.

Having both UTF-16LE and UTF-16BE lets each platform store and process text in its own native order, so nobody has to pay that conversion cost.
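If you are curious which order your own machine uses, a small probe is enough (a minimal sketch):

#include <stdio.h>

int main(void)
{
    unsigned short probe = 0x0102;
    unsigned char *first = (unsigned char *)&probe;

    /* On a little-endian machine the low byte, 0x02, is stored first. */
    printf("%s-endian\n", *first == 0x02 ? "little" : "big");
    return 0;
}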

For data that never leaves your machine, none of this matters.

Your shift-and-mask code is endian-independent, which is nice, but it does more work than necessary: a plain store of the short int would do.

[EDIT] A faster way to store a 16-bit value in 2 bytes:

char octet[2];
short *ptr = (short *)&octet[0]; /* reinterpret the two bytes as one short */
*ptr = 12345;                    /* a single store, in native byte order */

Whether octet[0] ends up holding the low 8 bits or the high 8 bits depends on your machine's byte order. Either way there is no shifting or masking, so the store is as cheap as it can be.

The layout is endianness-dependent; it only round-trips on a machine with the same byte order.

If you then need the bytes in the opposite order (i.e. octet[1],octet[0]), you have to swap them yourself.
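Swapping a single 16-bit value is just an 8-bit rotate (a sketch):

unsigned short swap16(unsigned short v)
{
    /* Move the high byte down and the low byte up. */
    return (unsigned short)((v << 8) | (v >> 8));
}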

Here is a trick that byte-swaps two 16-bit values at once using 32-bit operations:

char octet[4];
short *ptr = (short *)&octet[0];
*ptr++ = 12345;   /* first 16-bit value */
*ptr++ = 23456;   /* second 16-bit value */

int *ptr32 = (int *)&octet[0];   /* view the same 4 bytes as one 32-bit int */
int val = ((*ptr32 << 8) & 0xff00ff00) | ((*ptr32 >> 8) & 0x00ff00ff);
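A quick way to sanity-check the trick (a sketch; it assumes 16-bit shorts and a 32-bit int, and uses memcpy rather than pointer casts to stay clear of strict-aliasing trouble):

#include <stdio.h>
#include <string.h>

int main(void)
{
    unsigned short in[2] = { 12345, 23456 };
    unsigned short out[2];
    unsigned int word, val;

    memcpy(&word, in, sizeof word);  /* two shorts viewed as one 32-bit word */
    val = ((word << 8) & 0xff00ff00u) | ((word >> 8) & 0x00ff00ffu);
    memcpy(out, &val, sizeof out);

    printf("%hu %hu\n", out[0], out[1]); /* each input with its bytes swapped */
    return 0;
}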

Source: https://habr.com/ru/post/1756742/

