Although you already accepted a different answer, I would say that using NULL is actually the best choice for several reasons.
The first reason is that aggregates return a “correct” answer (that is, the one users expect) over NULL, but a “wrong” one when you use zero. Consider the results of AVG() in these two queries:
```sql
-- with zero; AVG gives 1.5
select SUM(measure), AVG(measure)
from (
    select 1.0 as 'measure'
    union all select 2.0
    union all select 3.0
    union all select 0
) dt

-- with NULL; AVG gives 2
select SUM(measure), AVG(measure)
from (
    select 1.0 as 'measure'
    union all select 2.0
    union all select 3.0
    union all select null
) dt
```
If we assume that the measure here is the "number of days to manufacture the goods", and NULL represents an item that is still in production, then zero gives the wrong answer. The same considerations apply to MIN() and MAX().
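The same behavior can be reproduced outside the original queries. Here is a small sketch using SQLite through Python's sqlite3 module (the table and column names are illustrative): aggregates skip NULL rows entirely, but count zero as a real value, which also shifts MIN.

```python
import sqlite3

# In-memory SQLite database with a single-measure fact table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact (measure REAL)")
cur.executemany("INSERT INTO fact VALUES (?)", [(1.0,), (2.0,), (3.0,), (None,)])

# The NULL row is ignored: AVG = 6.0 / 3 = 2.0, MIN = 1.0
avg_null, min_null = cur.execute(
    "SELECT AVG(measure), MIN(measure) FROM fact").fetchone()

# Replace NULL with zero, as the "default value" approach would.
cur.execute("UPDATE fact SET measure = 0 WHERE measure IS NULL")

# Zero now counts as a real value: AVG = 6.0 / 4 = 1.5, MIN = 0.0
avg_zero, min_zero = cur.execute(
    "SELECT AVG(measure), MIN(measure) FROM fact").fetchone()

print(avg_null, min_null)   # 2.0 1.0
print(avg_zero, min_zero)   # 1.5 0.0
```

Note that MIN drops from 1.0 to 0.0 after the substitution, even though no item was ever manufactured in zero days.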
The second problem is that if zero is the default value, how do you distinguish zero as the default from zero as a real value? For example, consider a measure of “delivery costs in euros,” where NULL means the customer collected the order himself (so no delivery took place), and zero means the order was shipped to the customer for free. You cannot use zero to replace NULL without completely changing the meaning of the data. You could argue that the difference should be recoverable from other dimensions (for example, the delivery method), but that adds complexity both to the reports and to understanding the data.
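To make the distinction concrete, here is a minimal sketch (again SQLite via Python; the hypothetical `orders` table and `shipping_cost` column are assumptions, not from the original schema). The two business cases stay separable only because NULL is preserved:

```python
import sqlite3

# shipping_cost semantics: NULL = customer picked the order up,
# 0.0 = shipped to the customer free of charge.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, shipping_cost REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 4.99), (2, None), (3, 0.0), (4, None)])

# With NULL intact, each case can be counted directly:
pickups = cur.execute(
    "SELECT COUNT(*) FROM orders WHERE shipping_cost IS NULL").fetchone()[0]
free_shipping = cur.execute(
    "SELECT COUNT(*) FROM orders WHERE shipping_cost = 0").fetchone()[0]

print(pickups, free_shipping)  # 2 1
```

If the NULLs had been loaded as zeros, both queries would collapse into a single count of 3, and the pickup/free-shipping distinction would have to be rebuilt from some other dimension.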