NSIntegerMax vs NSUIntegerMax

NSUInteger index = [self.objects indexOfObject:obj];

if (index == NSNotFound) {
    // Success! Note: NSNotFound is internally defined as NSIntegerMax
}

if (index == NSUIntegerMax) {
    // Fails!
}

Why? indexOfObject: returns an NSUInteger, so I naturally assumed that if the object was not found it would return NSUIntegerMax rather than NSIntegerMax. Is this a mistake, or is there a logical explanation for this behavior?

+6
3 answers

Presumably NSNotFound is defined this way so that it can also be used in contexts that work with NSInteger. It is also safer if someone declares the index as an NSInteger instead of an NSUInteger.

At best you could call it odd that NSNotFound is defined as NSIntegerMax, but it is certainly not a bug.
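A minimal sketch of what this answer means (the array contents and the missing object are made up for illustration): because NSIntegerMax is representable in a signed NSInteger, the sentinel keeps its value even when someone stores the result in the "wrong" integer type.

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            NSArray *objects = @[@"a", @"b", @"c"];

            // Deliberately store the result in a *signed* NSInteger.
            NSInteger index = [objects indexOfObject:@"missing"];

            // NSIntegerMax fits in an NSInteger, so the value is preserved
            // and the comparison by name still works.
            if (index == NSNotFound) {
                NSLog(@"not found, index = %ld", (long)index);
            }

            // Had NSNotFound been NSUIntegerMax instead, this signed variable
            // would hold -1 after the conversion, which is far more surprising
            // to log and to bounds-check against.
        }
        return 0;
    }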

+7

Your assumption is simply incorrect. The method returns NSNotFound, whatever that value happens to be now or in the future, and it can easily differ between platforms (e.g. 32-bit vs 64-bit).

The constant exists precisely so that you compare against it by name. Do not be misled by other constants that may happen to have the same value.
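A small sketch of the portable idiom this answer recommends (the helper name findIndexOf and the sample array are made up for illustration):

    #import <Foundation/Foundation.h>

    // Compare only against NSNotFound by name; never against a literal or a
    // *Max constant that merely happens to share its value on one platform.
    static NSUInteger findIndexOf(id obj, NSArray *objects) {
        NSUInteger index = [objects indexOfObject:obj];
        if (index == NSNotFound) {
            NSLog(@"%@ is not in the array", obj);
        }
        return index;
    }

    int main(void) {
        @autoreleasepool {
            (void)findIndexOf(@"missing", @[@"a", @"b"]);
        }
        return 0;
    }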

+1

An NSUInteger is unsigned (the 'U'), and an NSInteger is signed.

You can compare unsigned and signed integer values, but assuming that NSNotFound is actually equal to NSUIntegerMax is a mistake. It is in fact defined as NSIntegerMax in NSObjCRuntime.h.
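To verify this on your own machine, here is a tiny program (a sketch; the exact spelling of the definition in NSObjCRuntime.h varies between SDK versions) that prints the three values:

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            // On a 64-bit build the first two lines print the same number
            // (9223372036854775807) and the third prints 18446744073709551615,
            // confirming that NSNotFound equals NSIntegerMax, not NSUIntegerMax.
            NSLog(@"NSNotFound    = %lu", (unsigned long)NSNotFound);
            NSLog(@"NSIntegerMax  = %ld", (long)NSIntegerMax);
            NSLog(@"NSUIntegerMax = %lu", (unsigned long)NSUIntegerMax);
        }
        return 0;
    }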

0

Source: https://habr.com/ru/post/892018/

