Why is it faster to parse with double.Parse() and cast the result to decimal than to call decimal.Parse()?
Given the following:
string stringNumber = "18.34";
double dResult = 0d;
decimal mResult = 0m;
for (int i = 0; i < 9999999; i++)
{
    // variant 1: parse as double, then cast to decimal
    mResult = (decimal)double.Parse(stringNumber);
    // variant 2: parse directly as decimal
    mResult = decimal.Parse(stringNumber);
}
Profiling this in VS2017 (.NET Framework v4.7) gives the following results: the double.Parse() line (including the cast) accounts for 37.84% of CPU usage, versus 46.93% for the decimal.Parse() line. The difference seems larger than can easily be attributed to the difference in size between the two data types. Can someone explain?
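For reference on the size remark (this quick check is mine, not part of the original measurement), the two types differ only by a factor of two in size:

Console.WriteLine(sizeof(double));   // prints 8 (bytes)
Console.WriteLine(sizeof(decimal));  // prints 16 (bytes)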
The application where this showed up in the profiler runs for 10+ days, so this small difference adds up to hours of runtime. It would be good to understand why. I can see that decimal.Parse() calls into oleaut32.dll, but... wth?
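A minimal standalone sketch that times the two variants separately with Stopwatch, for anyone who wants to reproduce this outside the profiler (Release build assumed; absolute numbers will vary by machine and framework version):

using System;
using System.Diagnostics;

class ParseTiming
{
    static void Main()
    {
        const string stringNumber = "18.34";
        const int iterations = 9999999;
        decimal mResult = 0m;

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            // variant 1: parse as double, then cast to decimal
            mResult = (decimal)double.Parse(stringNumber);
        }
        sw.Stop();
        Console.WriteLine($"double.Parse + cast: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int i = 0; i < iterations; i++)
        {
            // variant 2: parse directly as decimal
            mResult = decimal.Parse(stringNumber);
        }
        sw.Stop();
        Console.WriteLine($"decimal.Parse:       {sw.ElapsedMilliseconds} ms");

        // print the last result so the parsing work isn't dead code
        Console.WriteLine(mResult);
    }
}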