In a recent interview on string manipulation, I was asked to optimize for performance. I had to iterate over TCHAR characters (2 bytes each, with UNICODE defined).
Without thinking about the possible length of the array, I made the classic mistake of using int instead of size_t for the loop index. I understand that this is not standards-compliant and not safe.
int i, size = _tcslen(str);
for (i = 0; i < size; i++) {
    /* ... */
}
But the maximum amount of memory we can allocate is limited, and if there is a fixed relationship between the width of int and the register width, it might be safe to use an int anyway.
For example: without any virtual-memory tricks, we can address at most 2^(register width) bytes. Since a TCHAR is 2 bytes, that is half as many characters. On any system with 32-bit registers and a 32-bit int, this will not be a problem, even with a signed int. People with an embedded background are used to int being 16 bits wide, but memory on such a device is limited as well. So I wonder whether there is an architectural guarantee that ties the width of int to the register width.