I was looking for a way to determine the scale and precision of a decimal in C#, which led me to several SO questions, but none of them seem to have correct answers, or have misleading titles (they are really about SQL Server or some other DB, not C#), or have no answers at all. The following post is, I think, the closest to what I need, but even it seems wrong:
Determine the decimal precision of the input number
First, there seems to be some confusion about the difference between scale and precision. From Google (via MSDN):
"Accuracy is the number of digits in a number. Scaling is the number of digits to the right of the decimal point in a number."
Given that, the number 12345.67890M would have a scale of 5 and a precision of 10. I have not found a single code example that correctly calculates this in C#.
I want to write two helper methods, decimal.Scale() and decimal.Precision(), so that the following unit test passes:

```csharp
[TestMethod]
public void ScaleAndPrecisionTest()
{
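    // NOTE: the body below is my reconstruction of what the test should assert,
    // based on the 12345.67890M example above; Scale() and Precision() are the
    // extension methods I want to write.

    // arrange
    var number = 12345.67890M;

    // act
    var scale = number.Scale();
    var precision = number.Precision();

    // assert
    Assert.AreEqual(5, scale);
    Assert.AreEqual(10, precision);
}
```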
I have yet to find a snippet that will do this, though several people have suggested using decimal.GetBits(), while others have said to convert it to a string and parse it.
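Presumably the string-based approach would look something like the sketch below (my own rough guess at what people mean; ParseByString is just a name I made up, and I use the invariant culture to sidestep the separator issue):

```csharp
using System;
using System.Globalization;

static class DecimalStringApproach
{
    // Rough sketch of the string-parsing idea: format the value with the
    // invariant culture and count digits around the '.' character.
    public static void ParseByString(decimal value, out int precision, out int scale)
    {
        string text = Math.Abs(value).ToString(CultureInfo.InvariantCulture);
        int pointIndex = text.IndexOf('.');

        scale = pointIndex < 0 ? 0 : text.Length - pointIndex - 1;

        // Count every digit, ignoring the decimal point and any leading zeros.
        string digits = text.Replace(".", "").TrimStart('0');
        precision = digits.Length == 0 ? 1 : digits.Length;
    }
}
```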
Converting it to a string and parsing it is, in my opinion, a terrible idea, even ignoring the localization issue with the decimal point. The math behind the GetBits() method, however, is Greek to me.
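For what it's worth, my shaky reading of the decimal.GetBits() documentation is that it returns four ints: the first three hold the low, mid, and high 32 bits of a 96-bit integer, and bits 16-23 of the fourth hold the power-of-ten scale factor. A sketch of where that reading leads me (the digit-counting part in particular is guesswork):

```csharp
using System.Numerics;

static class DecimalBitsApproach
{
    // Scale: bits 16-23 of the flags element hold the power of ten used to
    // divide the 96-bit integer (a value from 0 to 28).
    public static int Scale(this decimal value)
    {
        int[] bits = decimal.GetBits(value);
        return (bits[3] >> 16) & 0xFF;
    }

    // Precision: count the decimal digits of the 96-bit integer itself.
    // (Whether this is the "right" answer for values like 0.001M, where the
    // integer holds a single digit, is exactly what I am unsure about.)
    public static int Precision(this decimal value)
    {
        int[] bits = decimal.GetBits(value);
        BigInteger mantissa =
            ((BigInteger)(uint)bits[2] << 64) |
            ((BigInteger)(uint)bits[1] << 32) |
            (uint)bits[0];

        int digits = 1;          // zero still counts as one digit
        while (mantissa >= 10)
        {
            mantissa /= 10;
            digits++;
        }
        return digits;
    }
}
```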
Can someone describe what the calculations should look like to determine the scale and precision of a decimal value in C#?