Can you explain this behavior of Math.Log10 and BigInteger.Log10?

Can someone explain the following behavior of System.Numerics.BigInteger?

Console.WriteLine(Math.Log10(100));       // prints 2
Console.WriteLine(Math.Log10(1000));      // prints 3 (as expected)

Console.WriteLine((int)Math.Log10(100));  // prints 2
Console.WriteLine((int)Math.Log10(1000)); // prints 3 (as expected)

var bi100 = new BigInteger(100);
var bi1000 = new BigInteger(1000);

Console.WriteLine(BigInteger.Log10(bi100));       // prints 2
Console.WriteLine(BigInteger.Log10(bi1000));      // prints 3 (as expected)

Console.WriteLine((int)BigInteger.Log10(bi100));  // prints 2
Console.WriteLine((int)BigInteger.Log10(bi1000)); // prints 2 ???????

Console.WriteLine(Math.Floor(BigInteger.Log10(bi100)));   // prints 2
Console.WriteLine(Math.Floor(BigInteger.Log10(bi1000)));  // prints 2 ???????

Console.WriteLine(Math.Round(BigInteger.Log10(bi100)));  // prints 2
Console.WriteLine(Math.Round(BigInteger.Log10(bi1000))); // prints 3 (as expected)

EDIT: Please note that I know this is a precision problem. I want to know why the behavior of Math.Log10 and BigInteger.Log10 is different.

3 answers

This is due to floating-point accuracy and rounding.

This line:

Console.WriteLine((int)BigInteger.Log10(bi1000)); 

truncates the value 2.999999999999999696 to 2, because a cast to int simply drops the fractional part, whereas Console.WriteLine rounds during formatting and writes it as 3.

You can verify this by storing the result in an intermediate double variable and inspecting its value:

double x = BigInteger.Log10(bi1000);
Console.WriteLine((int)x); // prints 2: the cast truncates the fractional part
Console.WriteLine(x);      // prints 3 here: formatting rounds the value
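
To see what x actually holds, print it with the round-trip format specifier "R", which emits enough digits to reconstruct the exact double rather than a rounded form (a small check; the digits shown are what typical IEEE 754 doubles produce, and note that recent .NET runtimes round-trip doubles by default):

Console.WriteLine(x.ToString("R")); // e.g. 2.9999999999999996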

Presumably, BigInteger.Log10(x) is implemented as Math.Log(x) / Math.Log(10), whereas Math.Log10(x) has its own implementation (it is an extern, backed by the runtime). Because the BigInteger version goes through natural logarithms, exact powers of 10 pick up rounding error and can come out slightly below the true base-10 logarithm.
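
You can reproduce the difference without BigInteger at all by performing the same natural-log division by hand (a sketch assuming ordinary IEEE 754 double arithmetic; exact digits may vary by platform):

using System;

// The arithmetic BigInteger.Log10 effectively performs:
double viaNaturalLogs = Math.Log(1000) / Math.Log(10);
double direct = Math.Log10(1000);

Console.WriteLine(viaNaturalLogs.ToString("R")); // e.g. 2.9999999999999996
Console.WriteLine(direct.ToString("R"));         // 3

Console.WriteLine((int)viaNaturalLogs); // 2 - truncated
Console.WriteLine((int)direct);         // 3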


The behavior differs simply because they are different types, with different representations and different implementations.


Source: https://habr.com/ru/post/1793322/

