iOS binary method error

I have framework code that does this:

    #ifdef USE_DOUBLE
    typedef double real;
    #else
    typedef float real;
    #endif

But when I use this type with USE_DOUBLE defined (so real == double), like this:

    - (id)initSomeObject:(real)arg andSomeOtherStuff:(id)thing {
        self = [super init];
        if (self) {
            field = arg;
        }
        return self;
    }

    [someObject initSomeObject:2.0 andSomeOtherStuff:nil];

The arg value comes through as garbage, something like 5.3...e-315. However, if I turn USE_DOUBLE off, a value of 2.0 works fine. This is on iOS 5+ with Xcode 4.3.3 and LLVM 3.1, running on an iPhone 4. Can't iOS handle doubles? Should I leave USE_DOUBLE undefined?

1 answer

I had USE_DOUBLE, and the real typedef, defined in the static library that my application uses. It turns out that USE_DOUBLE must be set to the same value in both the framework and the client application for the typedef to resolve consistently. My application is a separate compilation unit, so without USE_DOUBLE it saw real as float, while the library's method expected a double. The caller therefore passed a 4-byte float where the callee read 8 bytes, which corrupted the value and caused the unexpected behavior. Adding #define USE_DOUBLE to my application project solved the problem.


Source: https://habr.com/ru/post/1432436/

