Consider integer division with remainder:
a = bq + r
where a, b, q, and r are, respectively, the dividend, divisor, quotient, and remainder. In particular, when b = 0 there is no unique q satisfying the equation for a given a, so it makes sense for the quotient q to be undefined in that case.
However, in that case a valid r does exist, namely r = a. If we assume the quotient and remainder are always defined together, then r is undefined whenever q is undefined. But in programming we often want to use the remainder operator % independently of division /. I have actually run into a situation where I wanted: if b == 0 then a else a % b end.
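For concreteness, here is a minimal Python sketch of the behaviour I have in mind; the helper name safe_mod is my own invention, not a built-in:

    def safe_mod(a, b):
        # Hypothetical helper (not a built-in operator): behaves like a % b,
        # but falls back to the dividend a when the divisor b is 0.
        return a if b == 0 else a % b

    print(safe_mod(7, 3))  # 1
    print(safe_mod(7, 0))  # 7, instead of a ZeroDivisionError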
Is there an operator in any programming language that behaves like %, but returns the dividend instead of raising a division-by-zero error when the divisor is 0?
Is there a reason why most (or all) programming languages raise a division-by-zero error for % 0?