Why does 3.14 + 1.00 evaluate to 4.140000000000001 (with extra decimal digits) in JavaScript?

<!DOCTYPE html>
<html>
<body>
<script>
var a = 3.14 + 1.00;
document.write(a + "<br>");
</script>
</body>
</html>

Output: 4.140000000000001

Every time I add something to 3.14, whether it is 1, 2, or 1.00, the result has extra decimal digits ending in "1".
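
Here is a minimal console sketch of the same behaviour; the toFixed() call at the end is just one assumed way to get a rounded string for display, not a change to the stored value:

console.log(3.14 + 1.00);              // 4.140000000000001
console.log(3.14 + 1);                 // 4.140000000000001 (1.00 and 1 are the same number)
console.log(3.14 + 2);                 // 5.140000000000001 (same pattern)
console.log((3.14).toFixed(20));       // roughly "3.14000000000000012..." - 3.14 itself is not stored exactly
console.log((3.14 + 1.00).toFixed(2)); // "4.14" - rounded string for display

As far as I understand, 3.14 cannot be stored exactly as a binary double, so the sum carries a tiny excess that becomes visible when JavaScript prints the shortest decimal that identifies the stored value.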

Source: https://habr.com/ru/post/1527323/
