I've been learning Mac programming over the last few months (I have experience working in other languages). Obviously, this has meant learning Objective-C, and therefore the plainer C on which it is based. In the process I came across this quote, which refers to the C/C++ language in general, not just to the Mac platform:
"With C and C++, it is preferable to use int rather than char or short. The main reason is that C and C++ perform arithmetic operations and parameter passing at the integer level, so even if you have an integer value that would fit in a byte, you should still consider using an int to hold the number. If you use a char, the compiler will first convert the values to int, perform the operations, and then convert the result back to char."
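To make sure I understand the promotion behavior the quote describes, here is a small C sketch of how I read it (the variable names are mine, just for illustration):

```c
#include <stdio.h>

int main(void) {
    char a = 100, b = 3;

    /* Per C's integer promotions, a and b are each widened to int
       before the multiplication is performed. */
    char narrow = a * b;  /* int result 300 is truncated back to char
                             (implementation-defined for signed char;
                             commonly 44 on these platforms) */
    int  wide   = a * b;  /* int result 300 is kept intact */

    printf("narrow = %d, wide = %d\n", narrow, wide);
    return 0;
}
```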
So my question is: is this true on the Mac desktop and on iPhone OS? I understand that, in the abstract, we are really talking about three or four different architectures (PPC, i386, ARM, and the A4's ARM variant), so there may be no single answer.
Still, the general question is this: on modern 32-bit/64-bit systems, does using 1- or 2-byte variables that don't match the machine's natural 4-byte word size buy as much efficiency as one might expect? For example, a plain old C array of 100,000 chars is four times smaller than the same array of 100,000 ints, but if reading each index during enumeration involves a cast/boxing/unboxing of sorts, will we see worse overall performance despite the saved memory overhead?
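Here is a minimal sketch of the two enumerations I'm comparing (the function names and the summing workload are just illustrative assumptions); whether the per-element widening in sum_chars costs anything measurable on these architectures is exactly what I'm asking:

```c
#include <stdio.h>
#include <stddef.h>

#define N 100000

static char small_vals[N]; /* 100,000 bytes */
static int  wide_vals[N];  /* 400,000 bytes where int is 4 bytes */

/* Summing the char array: by the usual arithmetic conversions, each
   element is widened to int as it is read, before being added. */
static long sum_chars(const char *a, size_t n) {
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += a[i]; /* implicit char -> int promotion per element */
    return total;
}

/* Summing the int array: elements are already word-sized, so no
   widening step is needed. */
static long sum_ints(const int *a, size_t n) {
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += a[i];
    return total;
}

int main(void) {
    for (size_t i = 0; i < N; i++) {
        small_vals[i] = (char)(i % 100);
        wide_vals[i]  = (int)(i % 100);
    }
    printf("%ld %ld\n", sum_chars(small_vals, N), sum_ints(wide_vals, N));
    return 0;
}
```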