Const vs #define (weird behavior)

I used to replace const with a C-style #define, but in the example below the program prints false.

 #include <iostream>
 #define x 3e+38
 using namespace std;

 int main() {
     float p = x;
     if (p == x)
         cout << "true" << endl;
     else
         cout << "false" << endl;
     return 0;
 }

But if I replaced

 #define x 3e+38 

with

 const float x = 3e+38; 

it works fine. The question is why? (I know there are several existing discussions of #define vs const, but they did not explain this, so kindly enlighten me.)
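For reference, here is the complete program with the const variant substituted in (same body as the snippet above, nothing else changed); on a typical IEEE 754 platform it prints true:

 #include <iostream>
 using namespace std;

 // x and p are both float here, so no double/float round trip occurs.
 const float x = 3e+38;   // the double literal is converted to float once, here

 int main() {
     float p = x;
     if (p == x)
         cout << "true" << endl;   // prints "true"
     else
         cout << "false" << endl;
     return 0;
 }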

2 answers

In C++, floating-point literals have type double. In the first example, the number 3e+38 is first converted to float when the variable is initialized, and then converted back to double for the comparison. These conversions are not required to be exact, so the two values can differ. In the second example, the value stays a float the whole time. To fix this, you can change p to double, write

 #define x 3e+38f 

(which makes it a float literal), or change the comparison to

 if (p == static_cast<float>(x)) 

which performs the same conversion as the variable initialization, so the comparison is done in single precision.
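For example, a sketch of that last fix applied to the original program (everything else left as in the question):

 #include <iostream>
 #define x 3e+38
 using namespace std;

 int main() {
     float p = x;                          // double literal narrowed to float
     if (p == static_cast<float>(x))       // compare in single precision
         cout << "true" << endl;           // now prints "true"
     else
         cout << "false" << endl;
     return 0;
 }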

Also, as noted, comparing floating-point numbers with == is usually not a good idea, since rounding errors give unexpected results; for example, (x*y)*z may differ from x*(y*z).
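As an aside (this helper is not part of the answer above, only an illustration), a common alternative to == is a relative-tolerance comparison; the tolerance below is an arbitrary illustrative default:

 #include <algorithm>
 #include <cmath>
 #include <iostream>
 #include <limits>

 // Rough relative-tolerance comparison; choose a tolerance that suits
 // your application, this default is only for demonstration.
 bool nearly_equal(double a, double b,
                   double rel_tol = 8 * std::numeric_limits<double>::epsilon()) {
     return std::fabs(a - b) <= rel_tol * std::max(std::fabs(a), std::fabs(b));
 }

 int main() {
     std::cout << std::boolalpha
               << nearly_equal(0.1 + 0.2, 0.3) << std::endl;  // true, unlike ==
     return 0;
 }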


The literal 3e+38 has type double.

The initialization

 float p = x; 

narrows 3e+38 to float, so it loses precision, and therefore its exact value, when it is stored in p.

This is why the comparison:

 if(p==x) 

evaluates to false, because p holds a slightly different value than 3e+38.
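You can make the lost precision visible by printing both values with extra digits (a small sketch; the exact digits depend on the platform, but a float keeps only about 7 significant decimal digits):

 #include <iostream>
 #include <iomanip>
 using namespace std;

 int main() {
     double d = 3e+38;   // what the literal actually is
     float  p = d;       // what ends up in p
     // The two printed values agree only in roughly the first 7 digits.
     cout << setprecision(20) << d << endl;
     cout << setprecision(20) << p << endl;
     return 0;
 }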


Source: https://habr.com/ru/post/1240795/

