In C#, (int)decimal and Convert.ToInt32(decimal) give me two different results

I understand that there are rounding errors, but can someone explain why I get such different results using these different methods:

 decimal amount = 9.990M;
 var cost = Convert.ToInt32(amount * 1000);
 var cost1 = (int)amount * 1000;

I get:

 cost = 9990
 cost1 = 9000
+4
3 answers

The second should be

 var cost1 = (int)(amount * 1000); 

You need to multiply by 1000 first and then convert the result. In your example, you convert first and then multiply.
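
A minimal sketch (a self-contained console program; the class name PrecedenceDemo is illustrative, not from the question) showing that the parenthesized expression agrees with Convert.ToInt32, while the unparenthesized cast still yields 9000:

 using System;

 class PrecedenceDemo
 {
     static void Main()
     {
         decimal amount = 9.990M;

         var cost  = Convert.ToInt32(amount * 1000); // 9990
         var cost1 = (int)(amount * 1000);           // 9990: multiply first, then cast
         var cost2 = (int)amount * 1000;             // 9000: cast binds tighter, so 9.990M -> 9, then * 1000

         Console.WriteLine($"{cost} {cost1} {cost2}"); // prints "9990 9990 9000"
     }
 }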

See Operator precedence and associativity in the C# specification.

+3

Try (int)(amount * 1000). In the Convert call, the parentheses ensure the multiplication happens first, but the (int) cast binds more tightly than multiplication, so you effectively have ((int)amount) * 1000, where the cast truncates amount to 9 before the multiplication.

In particular, see "7.2.1 Operator precedence and associativity" in the C# specification, which places the cast (a unary operator) above multiplication:

  • 7.5: Primary: x.y  f(x)  a[x]  x++  x--  new  typeof  default  checked  unchecked  delegate
  • 7.6: Unary: +  -  !  ~  ++x  --x  (T)x
  • 7.7: Multiplicative: *  /  %
  • etc.
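
Beyond precedence, the two conversions also behave differently on their own: the (int) cast truncates toward zero, while Convert.ToInt32(decimal) rounds to the nearest integer, with midpoints going to the even number. A minimal sketch (class name and sample values are illustrative):

 using System;

 class CastVsConvert
 {
     static void Main()
     {
         decimal d = 9.99M;

         Console.WriteLine((int)d);                 // 9  : cast truncates toward zero
         Console.WriteLine(Convert.ToInt32(d));     // 10 : Convert rounds to nearest

         Console.WriteLine((int)10.5M);             // 10 : truncation
         Console.WriteLine(Convert.ToInt32(10.5M)); // 10 : midpoint rounds to the even number
     }
 }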
+15

I wonder if there is a precedence problem? Try the following:

 (int)(amount*1000); 
+2

Source: https://habr.com/ru/post/1300736/

