C# and Java: 3/2 * 3.2 = 3.2, why?

In both C# and Java:

double d = 3 / 2 * 3.2;

Java

System.out.println(d); // 3.2

C#

Console.WriteLine(d); //3.2

Setting the 3/2 aside, we know that the correct answer should be 4.8.

If I change it to

double d = 3.00 / 2 * 3.2;

I get 4.8.

So I want to ask: if (3/2 * 3.2) is illegal, why do Eclipse and VS2008 show no error? And how can I prevent this problem in both C# and Java?

+3
8 answers

3 / 2 is treated as integer division, so the result is 1.

Multiplying 1 by 3.2 then promotes the integer 1 to the floating-point value 1.0, which yields 3.2.

The idea is this:

// Both "3" and "2" are integers, so integer division is performed.
3 / 2 == 1

// "3.2" is a floating point value, while "1" is an integer, so "1" is
// promoted to a floating-point value.
1 * 3.2  -->  1.0 * 3.2 == 3.2

Divide by 2.0 instead; Java then treats it as a double division, and you get 4.8 as expected.
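A minimal Java sketch of the two evaluations discussed in this answer (the class name is mine; C# behaves the same way):

```java
public class IntDivisionDemo {
    public static void main(String[] args) {
        // Both operands of 3 / 2 are ints, so this is integer division: 1.
        double d1 = 3 / 2 * 3.2;    // evaluates as 1 * 3.2

        // Writing 2.0 makes the divisor a double, so 3 / 2.0 is 1.5.
        double d2 = 3 / 2.0 * 3.2;  // evaluates as 1.5 * 3.2

        System.out.println(d1); // 3.2
        System.out.println(d2); // approximately 4.8 (binary rounding may show 4.800000000000001)
    }
}
```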


+11

3/2 is integer division, which yields 1.

1 * 3.2 is then 3.2, exactly as the compiler told you.

To fix it, make one of the division operands a floating-point value, as you did with 3.00.

As Thorbjørn notes, another option is to prefix the whole expression with 1.0*, i.e. 1.0*3/2*3.2.
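A quick Java check of the 1.0* prefix trick (left-to-right evaluation makes 1.0 * 3 a double before the division happens; the class name is mine):

```java
public class PrefixTrickDemo {
    public static void main(String[] args) {
        // 1.0 * 3 yields the double 3.0, so 3.0 / 2 is floating-point
        // division (1.5), and 1.5 * 3.2 is roughly 4.8.
        double d = 1.0 * 3 / 2 * 3.2;
        System.out.println(d); // approximately 4.8
    }
}
```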

+25

Yes, 3/2 is integer division: the fractional part is discarded (the result is truncated toward zero, which for positive operands behaves like a floor). To get floating-point division, at least one operand of the division must be a floating-point value.

For example, any of these work:

3.0 / 2 * 3.2          or      3 / 2.0 * 3.2
3.0 / 2.0 * 3.2
3d / 2d * 3.2 (C#)
(double)3 / 2 * 3.2
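Each variant above forces at least one operand of the division to be a double; a small Java check (the d suffix happens to be legal in Java as well as C#; the class name is mine):

```java
public class VariantCheck {
    public static void main(String[] args) {
        System.out.println(3.0 / 2 * 3.2);        // double dividend
        System.out.println(3 / 2.0 * 3.2);        // double divisor
        System.out.println(3.0 / 2.0 * 3.2);      // both double
        System.out.println(3d / 2d * 3.2);        // d-suffix literals
        System.out.println((double) 3 / 2 * 3.2); // explicit cast
        // All five lines print the same value, approximately 4.8.
    }
}
```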
+7

As others have noted, you can prefix the expression with a floating-point factor:

 1.0 * 3 / 2 * 3.2

or give the dividend a double suffix:

3d / 2 * 3.2
+6

Short version: 3/2 is integer division, which yields 1, not 1.5 (the fractional part is discarded). 1 * 3.2 is then a mixed int/double expression, so the int is promoted to the double 1.0.

Solution: make at least one operand of the division a floating-point value.

+3

3 / 2 is integer division.

Both 3 and 2 are integer literals. Dividing an int by an int produces an int (here, 1).

So 3 / 2 * 3.2 reduces to 1 * 3.2. Multiplying an int by a floating-point value promotes the int to floating point, so the result is 3.2.
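Splitting the expression into steps makes the promotion visible; a short Java sketch (variable names are mine):

```java
public class PromotionDemo {
    public static void main(String[] args) {
        int quotient = 3 / 2;           // int / int -> int, so this is 1
        double result = quotient * 3.2; // int * double -> the int is widened to 1.0
        System.out.println(quotient);   // 1
        System.out.println(result);     // 3.2
    }
}
```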

+1

In C# (and in Java, too), 3 and 2 are integer literals, so '3/2' is integer division and evaluates to 1.

3.2 is a double, so 1 * 3.2 is a double multiplication that yields 3.2, which is what you observed.

There is no error in this code, so Java and C# both accept it.

The safest way has always been to tell the compiler that you want floating-point values, using

3.0 / 2.0 * 3.2

This works in all cases.

+1

In C#:

3d / 2d * 3.2

As mentioned above

0

Source: https://habr.com/ru/post/1717005/

