C# Currency Formatting

I've run into an intriguing situation with currency rounding in C# (VS 2008 SP1). Below is an image of my test cases:

Test-case output (image): http://img697.imageshack.us/img697/8500/testcases.png

I expected cases five, six, and seven (my bad for not numbering them in the output) to round the number to the penny.

Here is my test code:

```csharp
static void Main(string[] args)
{
    decimal one = 10.994m;
    decimal two = 10.995m;
    decimal three = 1.009m;
    decimal four = 0.0044m;
    decimal five = 0.0045m;
    decimal six = 0.0046m;
    decimal seven = 0.0049m;
    decimal eight = 0.0050m;

    Console.WriteLine(one + ": " + one.ToString("C"));
    Console.WriteLine(two + ": " + two.ToString("C"));
    Console.WriteLine(three + ": " + three.ToString("C"));
    Console.WriteLine(four + ": " + four.ToString("C"));
    Console.WriteLine(five + ": " + five.ToString("C"));
    Console.WriteLine(six + ": " + six.ToString("C"));
    Console.WriteLine(seven + ": " + seven.ToString("C"));
    Console.WriteLine(eight + ": " + eight.ToString("C"));

    Console.ReadLine();
}
```
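As an aside, if you want a specific midpoint rule rather than whatever the formatter does, you can round explicitly before formatting. This is just a sketch: `Math.Round` on a `decimal` defaults to banker's rounding (`MidpointRounding.ToEven`) and can be switched to `MidpointRounding.AwayFromZero`:

```csharp
using System;

class RoundingDemo
{
    static void Main()
    {
        // Default is banker's rounding (ToEven): 0.0050 is exactly halfway
        // between 0.00 and 0.01, and 0 is the even neighbor.
        Console.WriteLine(Math.Round(0.0050m, 2));                                // 0.00
        Console.WriteLine(Math.Round(0.0050m, 2, MidpointRounding.AwayFromZero)); // 0.01

        // Non-midpoint digits behave the same under either rule: 0.0045 has
        // a 4 in the thousandths place, so both rules give 0.00.
        Console.WriteLine(Math.Round(0.0045m, 2, MidpointRounding.AwayFromZero)); // 0.00

        // Round first, then format, so the display matches the rule you chose.
        decimal eight = 0.0050m;
        Console.WriteLine(Math.Round(eight, 2, MidpointRounding.AwayFromZero).ToString("C"));
    }
}
```

Rounding to two places yourself takes the formatter's behavior out of the equation entirely.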

When I reflected into .ToString(string format) to find out what was going on, I found:

```csharp
public string ToString(string format)
{
    return Number.FormatDecimal(this, format, NumberFormatInfo.CurrentInfo);
}
```

which in turn calls:

```csharp
[MethodImpl(MethodImplOptions.InternalCall)]
public static extern string FormatDecimal(
    decimal value, 
    string format, 
    NumberFormatInfo info);
```

Is there logic in this call that says, in effect: the granularity of my current culture's NumberFormatInfo is two decimal places for currency, so a digit in the ten-thousandths place can't bump the number up because it's insignificant?
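For what it's worth, the two-decimal-place granularity does come from the culture: NumberFormatInfo exposes it as CurrencyDecimalDigits, which the "C" format specifier honors. A quick check (the exact symbol and digit count depend on the culture; en-US is used here for concreteness):

```csharp
using System;
using System.Globalization;

class CurrencyDigitsDemo
{
    static void Main()
    {
        // CurrencyDecimalDigits is the number of decimal places the "C"
        // format uses; for en-US it is 2.
        NumberFormatInfo nfi = CultureInfo.GetCultureInfo("en-US").NumberFormat;
        Console.WriteLine(nfi.CurrencyDecimalDigits); // 2

        // Overriding it changes what ToString("C") prints. GetCultureInfo
        // returns a read-only instance, so clone it before modifying.
        NumberFormatInfo fourPlaces = (NumberFormatInfo)nfi.Clone();
        fourPlaces.CurrencyDecimalDigits = 4;
        Console.WriteLine(0.0045m.ToString("C", fourPlaces)); // $0.0045
    }
}
```

So the sub-penny digits aren't lost in the decimal itself; they're only dropped at format time, to however many places the culture (or your override) asks for.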

How is this method implemented? Is it shifting bits around, or is something else going on?
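On the bit-shifting point: a decimal isn't a binary floating-point value at all; it's a 96-bit integer plus a power-of-ten scale factor, both visible through decimal.GetBits. So a formatter can round with plain decimal arithmetic rather than bit manipulation. Below is a naive half-away-from-zero sketch; RoundHalfAwayFromZero is my own illustrative helper, not the actual FormatDecimal implementation:

```csharp
using System;

class DecimalInternals
{
    // Hypothetical helper: round to 'digits' places, halves away from zero.
    // A sketch of one plausible rounding rule, not the CLR's internal code.
    static decimal RoundHalfAwayFromZero(decimal value, int digits)
    {
        decimal scale = 1m;
        for (int i = 0; i < digits; i++) scale *= 10m;

        decimal shifted = value * scale;           // 0.0050 -> 0.50
        decimal whole = decimal.Truncate(shifted); // integer part
        decimal frac = shifted - whole;            // leftover fraction

        if (frac >= 0.5m) whole += 1m;
        else if (frac <= -0.5m) whole -= 1m;

        return whole / scale;
    }

    static void Main()
    {
        // A decimal is a 96-bit integer and a scale: 0.0045 is 45 * 10^-4.
        int[] bits = decimal.GetBits(0.0045m);
        int mantissaLow = bits[0];                // 45
        int scaleFactor = (bits[3] >> 16) & 0xFF; // 4
        Console.WriteLine(mantissaLow + " * 10^-" + scaleFactor);

        Console.WriteLine(RoundHalfAwayFromZero(0.0050m, 2)); // value 0.01
        Console.WriteLine(RoundHalfAwayFromZero(0.0045m, 2)); // value 0
    }
}
```

The takeaway is that everything stays exact base-10 arithmetic, which is why `decimal` is the right type for currency in the first place.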

Thanks for any ideas.


Source: https://habr.com/ru/post/1740069/

