I have the following code:
decimal? a = 2m; decimal? b = 2m; decimal c = a ?? 1m * b ?? 1m;
Since both a and b have values, I expected c to be 4.
However, the result is 2, as if b had been taken as 1 instead of 2.
Does anyone know the reason for this behavior?
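For reference, here is a minimal sketch comparing the expression as written with the explicitly parenthesized version I expected it to be equivalent to (variable d is just my own name for the parenthesized variant):

```csharp
decimal? a = 2m;
decimal? b = 2m;

// The expression as I wrote it:
decimal c = a ?? 1m * b ?? 1m;      // prints 2

// What I expected it to mean:
decimal d = (a ?? 1m) * (b ?? 1m);  // prints 4

System.Console.WriteLine(c);
System.Console.WriteLine(d);
```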