BOOL Error from NSMethodSignature

I came across really strange behavior when using -[NSMethodSignature getArgumentTypeAtIndex:]. It returns the "@" character for the BOOL type, which is wrong according to the Objective-C type encodings. If I use method_getTypeEncoding from the objc/runtime.h library instead, the BOOL type is correctly represented as "B". I do not understand why it does not work with the higher-level NSMethodSignature. The following code demonstrates the problem:

#import <objc/runtime.h>

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    NSInvocation *inv = [NSInvocation invocationWithMethodSignature:
                            [self methodSignatureForSelector:@selector(viewDidAppear:)]];
    const char *encFromGetArgument = [[inv methodSignature] getArgumentTypeAtIndex:2];
    const char *encFromMethodSignature = method_getTypeEncoding(
        class_getInstanceMethod([self class], @selector(viewDidAppear:)));
    // Strip the embedded stack offsets/sizes from the runtime encoding string
    const char *methodEncodingPure =
        [[[[NSString stringWithUTF8String:encFromMethodSignature]
            componentsSeparatedByCharactersInSet:[NSCharacterSet decimalDigitCharacterSet]]
            componentsJoinedByString:@""] UTF8String];
    NSLog(@"BOOL arg from NSMethodSignature: %s", encFromGetArgument);
    // First type is the return value, second is self (target), third is _cmd (selector)
    NSLog(@"BOOL arg from objc/runtime.h: %c", methodEncodingPure[3]);
}

The above unexpectedly (at least for me) prints the following:

BOOL arg from NSMethodSignature: @

BOOL arg from objc/runtime.h: B

I am currently using my own workaround to avoid this odd behavior, but I want to know whether I am missing something or this is just a bug. My only guess is that BOOL is a primitive, and therefore it cannot be passed directly when calling Objective-C methods; however, when I try to check that, [object isKindOfClass:[NSNumber class]] returns NO.

UPDATE

OK, I updated Xcode to the latest version (6.1, build 6A1052d) and the situation has improved significantly. However, my problem now is distinguishing a char encoding from a real BOOL encoding. I know that on older architectures BOOL is typedef'd as a (signed) char, but how can I tell a real char from a BOOL in the encoding? My results now:

For the iPhone 6 simulator and a real iPhone 6 device I got:

argument 2: -------- -------- -------- --------
    type encoding (B) 'B'
    flags {}

BOOL arg from NSMethodSignature: B
BOOL arg from objc/runtime.h: B

which is great. However, for the iPhone 4s simulator and a real iPhone 5 device I get:

argument 2: -------- -------- -------- --------
    type encoding (c) 'c'
    flags {isSigned}

BOOL arg from NSMethodSignature: c
BOOL arg from objc/runtime.h: c

I am almost sure that if I checked an iPhone 5s, it would give the same result as the iPhone 6 (I think it all comes down to the 64-bit architecture). So now my question is how to handle older devices correctly: how can I distinguish char from BOOL on them? Or should I just assume that if the encoding is "c" and the argument is 1, we have YES, and for 0, NO?
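
A minimal sketch of that last fallback idea, purely as an illustration (this code and the helper name are mine, not from the original post; it assumes the invocation's argument has already been set, e.g. via setArgument:atIndex:):

#import <Foundation/Foundation.h>
#include <string.h>

// Hypothetical helper: on 32-bit, "c" may be a real char or a BOOL, so the
// best available fallback is to read the raw byte and guess:
// 1 => YES, anything else => NO.
static BOOL GuessBOOLArgument(NSInvocation *inv, NSUInteger index) {
    const char *enc = [[inv methodSignature] getArgumentTypeAtIndex:index];
    if (enc == NULL || strcmp(enc, "c") != 0) {
        return NO; // not the ambiguous 'c' encoding
    }
    signed char raw = 0;
    [inv getArgument:&raw atIndex:index]; // copy the raw argument byte out
    return raw == 1;                      // the heuristic from the question
}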

+5
3 answers
const char *buf1 = @encode(BOOL);
NSLog(@"bool type is: %s", buf1);

In a 32-bit simulator, @encode(BOOL) returns 'c', and in a 64-bit simulator it returns 'B'. In the build settings, change Architectures to $(ARCHS_STANDARD_32_BIT) and it will return 'c'.

In the objc/objc.h header:

#define OBJC_BOOL_DEFINED

/// Type to represent a boolean value.
#if !defined(OBJC_HIDE_64) && TARGET_OS_IPHONE && __LP64__
typedef bool BOOL;
#else
typedef signed char BOOL;
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C"
// even if -funsigned-char is used.
#endif

BOOL on 32-bit is a signed char type, so you can at least distinguish it from an unsigned char:

@encode(char)          --> 'c'
@encode(unsigned char) --> 'C'

You can tell whether the device is 32-bit or 64-bit from here, and if it is 32-bit, you can use 'c' to check for BOOL.
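
To illustrate, a minimal sketch of that approach (my own code, not from the answer; the macro and function names are hypothetical, and the condition is simplified to the iOS cases discussed here):

#import <Foundation/Foundation.h>
#include <string.h>

// On 64-bit iOS BOOL is a real bool ('B'); on 32-bit it is signed char ('c').
// (Simplified: the actual condition in objc.h also checks TARGET_OS_IPHONE.)
#if __LP64__
#define EXPECTED_BOOL_ENCODING "B"
#else
#define EXPECTED_BOOL_ENCODING "c"
#endif

static BOOL EncodingMayBeBOOL(const char *enc) {
    // On 32-bit this also matches plain char arguments, since both encode as 'c'.
    return enc != NULL && strcmp(enc, EXPECTED_BOOL_ENCODING) == 0;
}

Comparing against @encode(BOOL) gives the same result without the #if, as the last answer below shows.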

+5

In the debugger, I got the following:

(lldb) po [inv methodSignature]

<NSMethodSignature: 0x7be7f660>
    number of arguments = 3
    frame size = 12
    is special struct return? NO
    return value: -------- -------- -------- --------
        type encoding (v) 'v'
        flags {}
        modifiers {}
        frame {offset = 0, offset adjust = 0, size = 0, size adjust = 0}
        memory {offset = 0, size = 0}
    argument 0: -------- -------- -------- --------
        type encoding (@) '@'
        flags {isObject}
        modifiers {}
        frame {offset = 0, offset adjust = 0, size = 4, size adjust = 0}
        memory {offset = 0, size = 4}
    argument 1: -------- -------- -------- --------
        type encoding (:) ':'
        flags {}
        modifiers {}
        frame {offset = 4, offset adjust = 0, size = 4, size adjust = 0}
        memory {offset = 0, size = 4}
    argument 2: -------- -------- -------- --------
        type encoding (c) 'c'
        flags {isSigned}
        modifiers {}
        frame {offset = 8, offset adjust = 0, size = 4, size adjust = -3}
        memory {offset = 0, size = 1}

And with NSLog I printed:

BOOL arg from NSMethodSignature: c

I could not write a comment, so I am posting this as an answer. Could you check your [inv methodSignature]? And could you post your NSLog output right after encFromGetArgument is assigned?

+2

Based on @gabbler's answer, I managed to check whether the argument is a BOOL or not. The following code works on both 32-bit and 64-bit architectures:

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    NSInvocation *inv = [NSInvocation invocationWithMethodSignature:
                            [self methodSignatureForSelector:@selector(viewDidAppear:)]];
    const char *encFromGetArgument = [[inv methodSignature] getArgumentTypeAtIndex:2];
    // @encode(BOOL) is 'B' on 64-bit iOS and 'c' on 32-bit, so this matches both.
    // (Note: on 32-bit a plain char argument also encodes as 'c' and would match too.)
    if (0 == strcmp(@encode(BOOL), encFromGetArgument)) {
        NSLog(@"arg is bool");
    } else {
        NSLog(@"arg is not bool");
    }
}
+2

Source: https://habr.com/ru/post/1205070/

