An array doubles in size if the union defines both its uint16_t word and uint8_t bytes

I have an array whose elements can be accessed either as a uint16_t or as a pair of uint8_t.

Its elements are defined as a union of a uint16_t and a sub-array of 2 uint8_t.

Unfortunately, the compiler (Microchip XC16) allocates twice as much memory for the array as it should.

typedef union {
    uint16_t u16;      // As uint16_t
    uint8_t  u8[2];    // As uint8_t
} my_array_t;

my_array_t my_array[1];   // 1-word array, for testing

my_array[0].u8[0] = 1;
my_array[0].u8[1] = 2;

uint8_t  byte_0 = my_array[0].u8[0];   // Gets 0x01
uint8_t  byte_1 = my_array[0].u8[1];   // Gets 0x02
uint16_t word_0 = my_array[0].u16;     // Gets 0x0201

The compiler allocates 4 bytes per element instead of the 2 bytes it should.

Workaround: if I change the union to:

typedef union {
    uint16_t u16;      // As uint16_t
    uint8_t  u8[1];    // As uint8_t
} my_array_t;

The compiler then properly allocates 2 bytes, but this assignment is no longer valid:

 my_array[0].u8[1] = 2; 

although this read still works:

 uint8_t byte_1 = my_array[0].u8[1]; // Gets 0x02 

(except for the inconvenience that the debugger does not show its value).
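One way to avoid indexing past the declared u8[1] member is to go through the u16 member with shifts and masks instead; a minimal sketch (the helper names are only illustrative):

#include <stdint.h>

typedef union {
    uint16_t u16;     // As uint16_t
    uint8_t  u8[1];   // As uint8_t (workaround form from above)
} my_array_t;

// High-byte read/write through the u16 member, with no out-of-bounds index.
static inline uint8_t get_byte_1(const my_array_t *e)
{
    return (uint8_t)(e->u16 >> 8);
}

static inline void set_byte_1(my_array_t *e, uint8_t value)
{
    e->u16 = (uint16_t)((e->u16 & 0x00FFu) | ((uint16_t)value << 8));
}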

Question: should I live with the workaround, or is there a better solution?

See the discussion below for the solution that was adopted.


EDIT.

At the suggestion of EOF (below), I checked the sizeof values.

Before the workaround:

sizeof(my_array_t)       // Is 4
sizeof(my_array[0])      // Is 4
sizeof(my_array[0].u8)   // Is 2

After the workaround:

sizeof(my_array_t)       // Is 2
sizeof(my_array[0])      // Is 2
sizeof(my_array[0].u8)   // Is 2

This suggests it is a compiler bug.
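Such unexpected padding can also be caught at build time; a minimal sketch, assuming the compiler accepts C11 _Static_assert (if it does not, the usual negative-size-array trick achieves the same effect):

#include <stdint.h>

typedef union {
    uint16_t u16;      // As uint16_t
    uint8_t  u8[2];    // As uint8_t
} my_array_t;

// Compilation fails if the union is padded beyond one 16-bit word.
_Static_assert(sizeof(my_array_t) == 2, "my_array_t must be exactly 2 bytes");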

+6
2 answers

Instead of an array of 2 bytes, use a structure of 2 bytes:

// Two bytes in a 16-bit word
typedef struct {
    uint8_t lsb;   // As uint8_t, LSB
    uint8_t msb;   // As uint8_t, MSB
} two_bytes_t;

typedef union {
    uint16_t    u16;    // As uint16_t
    two_bytes_t u8x2;   // As 2 each of uint8_t
} my_array_t;

my_array_t my_array[1];   // 1-word array, for testing

my_array[0].u8x2.msb = 1;
my_array[0].u8x2.lsb = 2;

The XC16 compiler correctly allocates only 2 bytes for each element, and the debugger correctly displays individual bytes.
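A quick read-back sketch of how the two views line up; the 0x0102 result assumes the little-endian byte order of the PIC24/dsPIC devices that XC16 targets:

my_array_t my_array[1];

my_array[0].u8x2.msb = 1;   // High byte
my_array[0].u8x2.lsb = 2;   // Low byte

uint16_t word_0 = my_array[0].u16;   // Gets 0x0102 on a little-endian target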

+5

It looks like this problem has been fixed in the compiler. I tested it in XC16 1.26 and got these results from my code (optimization level 0):

 #include "mcc_generated_files/mcc.h" #include "stddef.h" typedef union { uint16_t u16; uint8_t u8[2]; } example_1_t; typedef union { uint16_t u16; struct { uint8_t lsb; uint8_t msb; }; } example_2_t; int main(void) { SYSTEM_Initialize(); size_t typeSize1 = sizeof (example_1_t); // debugger shows 2 size_t typeSize2 = sizeof (example_2_t); // debugger shows 2 example_1_t ex1; // Can see all values in debugger ex1.u16 = 0x4321; // u8[0] = 0x21, u8[1] = 0x43 example_2_t ex2; // Can see all values in debugger ex2.u16 = 0x4321; // lsb = 0x21, msb = 0x43 size_t objSize1 = sizeof (ex1); // debugger shows 2 size_t objSize2 = sizeof (ex2); // debugger shows 2 while (1) { } return -1; } 
+1

Source: https://habr.com/ru/post/978345/

