Can using exceptions as a control-flow mechanism be valid in some specific scenarios?

I am working on an implementation of RationalNumber. In particular, for the equality logic, I am considering the following:

    public bool Equals(RationalNumber other)
    {
        if (RationalNumber.IsInfinity(this) || RationalNumber.IsInfinity(other) ||
            RationalNumber.IsNaN(this) || RationalNumber.IsNaN(other))
        {
            return false;
        }

        try
        {
            checked
            {
                return this.numerator * other.Denominator == this.Denominator * other.numerator;
            }
        }
        catch (OverflowException)
        {
            var thisReduced = RationalNumber.GetReducedForm(this);
            var otherReduced = RationalNumber.GetReducedForm(other);

            return (thisReduced.numerator == otherReduced.numerator) &&
                   (thisReduced.Denominator == otherReduced.Denominator);
        }
    }

As you can see, I use exceptions as a control-flow mechanism. The reason is that I do not want to pay the cost of computing the greatest common divisor of both fractions on every equality test. So I decided to do that only in the least likely case: overflow of one or both cross products.

Is this an acceptable practice? I have always read that exceptions should never be used for control flow, but I see no other way to achieve what I want.

Any alternative approaches are welcome.

+6

3 answers

The reason for this is that I do not want to pay the cost of computing the greatest common divisor of both fractions on every equality test.

This is a reasonable argument. The total cost of this code is

 {probability of fast path} * {fast-path cost} + (1 - {probability of fast path}) * {slow-path cost} 

Depending on these three quantities, this will be a good or a bad choice. You need a good idea of what data will occur in practice.
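To make that trade-off concrete, here is a small worked example. The ~10 ns fast-path cost and ~100,000 ns exception-plus-GCD cost below are assumptions for illustration only, not measurements:

```csharp
using System;

class ExpectedCostDemo
{
    static void Main()
    {
        // Assumed, illustrative costs: a cross-product comparison is ~10 ns,
        // a thrown OverflowException plus the GCD fallback is ~100,000 ns.
        double fastCost = 10;
        double slowCost = 100_000;

        foreach (double pSlow in new[] { 0.001, 0.00001 })
        {
            double expected = (1 - pSlow) * fastCost + pSlow * slowCost;
            Console.WriteLine($"overflow rate {pSlow}: ~{expected:F1} ns per call");
        }
        // With a 0.1% overflow rate the rare slow path still dominates
        // (~110 ns per call); at 0.001% the trick clearly pays off (~11 ns).
    }
}
```

So the optimization only wins when overflow is genuinely rare for your data.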

Note that exceptions are very slow. I once benchmarked them at about 10,000 per second per CPU core, and I'm not sure they scale across multiple cores because of internal CLR locks.

Perhaps you can add profiling at runtime: track the rate of exceptions, and if it is too high, turn the optimization off.
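A minimal sketch of such runtime self-profiling (every name and threshold here is invented for illustration, not from the question's code): count comparisons and overflows, and once the observed overflow rate crosses a threshold, switch to always reducing first.

```csharp
using System;
using System.Threading;

// Illustrative sketch; the class, counters, and 1% threshold are invented.
static class EqualityStats
{
    static long comparisons;
    static long overflows;

    public static void RecordComparison() => Interlocked.Increment(ref comparisons);
    public static void RecordOverflow() => Interlocked.Increment(ref overflows);

    // After a warm-up of 1,000 comparisons, prefer the GCD-based path
    // whenever more than 1% of comparisons have overflowed.
    public static bool PreferReducedForm
    {
        get
        {
            long total = Interlocked.Read(ref comparisons);
            long slow = Interlocked.Read(ref overflows);
            return total > 1_000 && slow * 100 > total;
        }
    }
}

class Demo
{
    static void Main()
    {
        for (int i = 0; i < 2_000; i++) EqualityStats.RecordComparison();
        for (int i = 0; i < 50; i++) EqualityStats.RecordOverflow();
        Console.WriteLine(EqualityStats.PreferReducedForm); // True: 2.5% > 1%
    }
}
```

The equality method would then consult `PreferReducedForm` before trying the checked fast path.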

You should probably document why you did this.

This is also not an architectural problem: if you change your mind later, you can easily switch to a different algorithm.

Alternatively, you can first compute and compare the cross products in unchecked arithmetic. If the results are unequal, the exact results are guaranteed to be unequal, even if an overflow occurred, because values that are exactly equal are also equal modulo 2^64. This can give you an exception-free fast path when many numbers are not equal.
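A sketch of that exception-free fast path as a standalone helper (not the OP's actual class; long fields are assumed, per the question): wraparound products that differ prove the exact products differ, while matching wraparound products still require the reduced-form fallback to confirm equality.

```csharp
using System;

// Minimal sketch: "definitely not equal" test for fractions a/b vs c/d
// using only wraparound (unchecked) long arithmetic.
static class CrossProduct
{
    // True means the fractions are provably unequal: if a*d == b*c exactly,
    // the products would also be equal modulo 2^64, so a mismatch is proof.
    // False means "maybe equal" - confirm via reduced forms (GCD).
    public static bool DefinitelyNotEqual(long a, long b, long c, long d)
    {
        unchecked
        {
            return a * d != b * c;
        }
    }
}

class Demo
{
    static void Main()
    {
        Console.WriteLine(CrossProduct.DefinitelyNotEqual(1, 3, 1, 2)); // True: 1/3 != 1/2
        Console.WriteLine(CrossProduct.DefinitelyNotEqual(2, 4, 1, 2)); // False: maybe equal (they are)
        // Both products overflow here, but the wrapped values still differ,
        // so inequality is proven without any exception being thrown.
        Console.WriteLine(CrossProduct.DefinitelyNotEqual(long.MaxValue, 2, long.MaxValue - 1, 2)); // True
    }
}
```

Only the "maybe equal" case would then fall through to the GCD-based comparison, so most unequal pairs never pay for an exception.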

+2

Catching exceptions usually has a large overhead, and you should only catch an exception if you can do something with it.

In your case, you can do something with the exception. In my opinion, using it for control flow here is not a problem, but I suggest you also implement the explicit logic (checking the conditions that would otherwise cause the exception), then benchmark both versions. Catching exceptions usually has a high overhead, but if checking the preconditions takes even longer, handling the exception is the better approach.

Update due to OP's comment (it is a new implementation, not a Rational type from the .NET Framework; the numerator and denominator are of type long):

You can use larger types, such as decimal or BigInteger, to avoid the overflow exception:

    decimal thisNumerator = this.numerator;
    decimal thisDenominator = this.Denominator;
    decimal otherNumerator = other.numerator;
    decimal otherDenominator = other.Denominator;
    // Note: decimal widens the range but is not overflow-proof for long
    // fields - a product of two longs can exceed decimal's maximum and
    // still throw. BigInteger cannot overflow at all.
    return thisNumerator * otherDenominator == thisDenominator * otherNumerator;
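Since the fields are long, decimal only postpones the problem (long.MaxValue squared is about 8.5e37, beyond decimal's ~7.9e28 range), so System.Numerics.BigInteger is the fully safe widening. A sketch as a standalone helper (names invented for illustration):

```csharp
using System;
using System.Numerics;

class BigIntegerCrossProduct
{
    // Sketch only: exact cross-product comparison for fractions with long
    // numerators and denominators. BigInteger arithmetic cannot overflow,
    // so no try/catch and no reduction to lowest terms is needed.
    static bool FractionsEqual(long aNum, long aDen, long bNum, long bDen)
    {
        return (BigInteger)aNum * bDen == (BigInteger)aDen * bNum;
    }

    static void Main()
    {
        Console.WriteLine(FractionsEqual(1, 2, 2, 4));                         // True
        Console.WriteLine(FractionsEqual(long.MaxValue, 1, long.MaxValue, 1)); // True
        Console.WriteLine(FractionsEqual(long.MaxValue, 2, long.MaxValue, 3)); // False
    }
}
```

The cost is that BigInteger multiplication allocates and is noticeably slower than long arithmetic, so this trades the exceptional-path cost for a uniformly slower comparison.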

Update due to comments:

A simple example to show the overhead of exception handling:

    const int Iterations = 100000;

    var sw = new Stopwatch();
    var sum1 = 0;
    sw.Start();
    for (int i = 0; i < Iterations; i++)
    {
        try
        {
            var s = int.Parse("s" + i);
            sum1 += s;
        }
        catch (Exception)
        {
        }
    }
    sw.Stop();
    Console.WriteLine(sw.ElapsedMilliseconds);
    Console.WriteLine(sum1);

    var sw2 = new Stopwatch();
    var sum2 = 0;
    sw2.Start();
    for (int i = 0; i < Iterations; i++)
    {
        int s;
        if (int.TryParse("s" + i, out s))
            sum2 += s;
    }
    sw2.Stop();
    Console.WriteLine(sw2.ElapsedMilliseconds);
    Console.WriteLine(sum2);

Result: exception handling is at least 170 times slower (both sums are 0 because no input ever parses):

5123   (ms, int.Parse with catch)
0      (sum1)
30     (ms, int.TryParse)
0      (sum2)

+1

This approach is described on MSDN: https://msdn.microsoft.com/en-Us/library/74b4xzyw.aspx

But catching an exception has a high overhead, because at that point the process may have to switch from user mode to kernel mode.

0

Source: https://habr.com/ru/post/989812/
