The behavior you see is a consequence of the fact that the Windows printf() function is implemented differently from the Linux printf() function. Most likely, the difference lies in how each implementation rounds floating-point values when formatting them.
How printf() works under the hood is an implementation detail of each system, so neither platform is likely to give you fine-grained control over how printf() displays floating-point values.
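If you want to confirm that the difference is only in the display and not in the computed value, you can print the value at full precision on both machines. A minimal sketch (the value used here is just a placeholder, not from your program):

```c
#include <stdio.h>

int main(void)
{
    double d = 2464.000001;   /* hypothetical value; substitute your own result */

    printf("%f\n", d);        /* default 6-decimal display; rounding of the last digit may differ */
    printf("%.17g\n", d);     /* 17 significant digits are enough to round-trip a double exactly */
    return 0;
}
```

If the %.17g output matches on both platforms, the underlying doubles are identical and only the default display rounding differs.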
There are two ways to keep them the same:
Use more precision during the calculation than when displaying the result (see the sketch after this list). For example, some scientific and graphing calculators do all internal calculations in double precision but display results only to float precision.
Use a cross-platform printf() library. Such a library is likely to behave the same on every platform, since the calculations needed to decide which digits to display are usually platform-agnostic.
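To illustrate the first option, here is a minimal C sketch (the values are made up): the arithmetic keeps full double precision, but the format string asks printf() for fewer digits than the computation carries, so a platform-specific rounding difference in the last internal digit never reaches the output.

```c
#include <stdio.h>

int main(void)
{
    double a = 2464.0000005;   /* hypothetical intermediate results */
    double b = 0.0000004;

    double sum = a + b;        /* full double precision internally */

    printf("%.4f\n", sum);     /* display only 4 decimal places, well below the precision used */
    return 0;
}
```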
However, this is not as big a problem as you think. The difference between the outputs is 0.000001, which is about 0.0000000004% of either value. The display error is negligible.
Consider this: the distance between Los Angeles and New York is 2464 miles, which is the same order of magnitude as the numbers in your output. A difference of 0.000001 miles is about 1.61 millimeters. Of course, we do not measure the distance between cities to that accuracy. :-)