C# unit test numerical accuracy question

I am testing basic math functions that return the mean / variance / standard deviation. The problem I am facing is that I cannot get the "expected value" to match what the function returns exactly. For example, if the deviation function returns 50.555555555556666, even if I explicitly set the expected value to 50.555555555555566, the assertion says they are two different doubles and the unit test fails.

The following is the actual output from the unit test:

Assert.AreEqual failed. Expected:<50.555555555555556>. Actual:<50.555555555556666>.

Can anyone advise on this? I am using the built-in Visual Studio unit testing framework (MSTest). Thanks.
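
A simplified sketch of the failing assertion, with hard-coded values standing in for the real function (it assumes a standard MSTest [TestClass] with the Microsoft.VisualStudio.TestTools.UnitTesting namespace imported):

[TestMethod]
public void Deviation_ExactComparison_Fails()
{
    // Stand-in for the value the deviation function actually returned.
    double actual = 50.555555555556666;

    // This overload compares the two doubles exactly, so the tiny
    // difference in the last digits makes the assertion fail.
    Assert.AreEqual(50.555555555555556, actual);
}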

1 answer

Floating point numbers (Single / Double) should be tested with a tolerance. For example, you would say that if two numbers are within 0.0001 (the tolerance) of each other, they are considered equal.

In NUnit, for example, you would use the following AreEqual overload; find the equivalent for MSTest:

Assert.AreEqual( double expected, double actual, double tolerance,
                 string message );
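
For example, a usage sketch with the question's numbers and an illustrative tolerance of 1e-9 (the variable deviation is a placeholder for your function's result):

// NUnit: passes as long as |expected - actual| <= 1e-9.
double deviation = 50.555555555556666;
Assert.AreEqual( 50.555555555555556, deviation, 1e-9,
                 "standard deviation outside tolerance" );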

Update: this should be the method you need in MSTest (an Assert.AreEqual overload that takes a delta). Try it and check whether it fixes your problem.
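
A sketch of what that looks like in an MSTest test; the class and method names here are made up, and 1e-9 is just an illustrative delta:

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class DeviationToleranceTests
{
    [TestMethod]
    public void Deviation_WithinTolerance_Passes()
    {
        // Stand-in for the value returned by the deviation function.
        double actual = 50.555555555556666;

        // MSTest's Assert.AreEqual(double expected, double actual, double delta)
        // overload: the assertion passes because the two values differ
        // by far less than the allowed delta of 1e-9.
        Assert.AreEqual(50.555555555555556, actual, 1e-9);
    }
}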


Source: https://habr.com/ru/post/1771327/

