Is it really a thing like char or short in modern programming?

Iโ€™ve been learning a Mac program in the last few months (I have experience working in other languages). Obviously, this meant learning Objective-C, and therefore the simpler C on which it is based. So I come across this quote, which refers to the C / C ++ language in general, and not just to the Mac platform.

With C and C ++, it is preferable to use int over char and short. The main reason is that C and C ++ perform arithmetic operations and the parameter going to the integer level, if you have an integer value that can be a byte, you should still use int to hold the number. If you use a char, the compiler will first convert the values โ€‹โ€‹to an integer, perform operations, and then convert the result to char.

So my question is, is this true on Mac Desktop and IPhone OS? I understand that speaking of abstract environments, we are actually talking about 3-4 different architectures (PPC, i386, Arm and A4 Arm versions), so there can be no answer.

However, the general principle is that in modern 32-bit / 64-bit systems using 1-2 byte variables that are not consistent with the physical 4-byte machine words, does not provide much of the efficiency that we can expect.

For example, a simple old C-array with a size of 100,000 characters is four times smaller than the same 100,000 ints, but if during enumeration, reading each index involves cast / boxing / unboxing sorting, we will see the general โ€œperformanceโ€ "despite saved memory overhead?

+3
source share
5 answers

. ( int8_t int16_t). , .

+3

PPC/Arm/A4Arm, x86 , 8 , 16 32 (64 , x86_64 64 ), , . 32- , 16/24 , .

, , , ... .

+2

, . , , UTF-8, ASCII , , 64- ?

, , . .

, , . .

+1

x86 , - 8 .

.

, , 1, 2, 4 () 8 .

0

, , , , , .

C .

2008 . 98% , , .

0

Source: https://habr.com/ru/post/1743641/


All Articles