We get a compile-time error when an integer is divided by zero, while in the case of double there is no compilation error; instead, at runtime we get infinity or NaN as the result. Any idea why int and double behave differently when dividing by zero?
void Main()
{
    int number = 20;
    var result1 = number / 0;          // compile-time error CS0020: division by constant zero
    double doubleNumber = 20;
    var result2 = doubleNumber / 0.0;  // no compile-time error; result is Infinity (or NaN for 0.0/0.0)
}
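For contrast, here is a minimal sketch (assuming C#; the bare void Main() form suggests a LINQPad-style snippet) showing that the compile-time check only fires when the divisor is a constant. With the zero stored in a variable, the integer division compiles but throws System.DivideByZeroException at runtime, while the double division follows IEEE 754 and quietly produces Infinity or NaN:

void Main()
{
    int zero = 0;                      // not a compile-time constant, so CS0020 cannot fire
    int number = 20;

    try
    {
        var result1 = number / zero;   // compiles, but throws at runtime
        Console.WriteLine(result1);
    }
    catch (DivideByZeroException e)
    {
        Console.WriteLine(e.Message);  // "Attempted to divide by zero."
    }

    double doubleNumber = 20;
    Console.WriteLine(doubleNumber / 0.0);   // Infinity
    Console.WriteLine(-doubleNumber / 0.0);  // -Infinity
    Console.WriteLine(0.0 / 0.0);            // NaN
}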