Much has been said on this topic, but I could not find the exact answer to my question.
JavaScript cannot exactly represent some decimal fractions, such as 0.1, and that much is understandable.
For example, a comparison like this one evaluates to true because of rounding errors that occur during computation:
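```js
// 0.1 and 0.2 have no exact binary representation as IEEE 754 doubles,
// so their sum is not exactly 0.3.
console.log(0.1 + 0.2 === 0.30000000000000004); // true
console.log(0.1 + 0.2 === 0.3);                 // false
```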
That is fine: it is all in accordance with the IEEE Standard for Floating-Point Arithmetic (IEEE 754).
What I don't understand is why other languages that also use this standard seem to give more accurate results.
Is it because of the different rounding rules they use (https://en.wikipedia.org/wiki/IEEE_floating_point#Rounding_rules), or am I missing something?
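To make the comparison concrete, here is the same sum printed at different precisions in JavaScript; I assume other languages store the same IEEE 754 double and differ only in how many digits they print by default, but that is exactly what I am unsure about:

```js
const sum = 0.1 + 0.2;

console.log(String(sum));         // "0.30000000000000004" (default shortest round-trip form)
console.log(sum.toFixed(4));      // "0.3000" (rounded to 4 decimal places, looks "exact")
console.log(sum.toPrecision(21)); // shows more of the digits that are actually stored
```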