Precision and scale of decimal multiplication in (My)SQL

Let's say I have two numbers num1(p1,s1) and num2(p2, s2) (where p and s are precision and scale, respectively).

What is the rule for calculating the precision and scale of the result of num1 * num2?

Intuitively, the scale will be the sum of s1 and s2 (because that is how you do multiplication in 1st grade!), but I can't find the rules for determining the precision.

Thanks.

1 answer

I assume you declared the two numbers something like this (the syntax in your question is not valid SQL):

 CREATE TABLE decimals (num1 DECIMAL(p1,s1), num2 DECIMAL(p2,s2)); 

Precision is simply the total number of digits in which the number is stored, and scale is how many of those digits come after the decimal point.

So you are right that the scale of the result of the multiplication will be s1 + s2, and the maximum precision will likewise be the sum p1 + p2. The actual precision may be less, depending on the numbers being multiplied.
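For instance, a destination column sized as DECIMAL(p1+p2, s1+s2) is guaranteed to hold any product. A minimal sketch, assuming (purely for illustration) that num1 is DECIMAL(5,2) and num2 is DECIMAL(4,3) in the decimals table above:

 CREATE TABLE products (result DECIMAL(9,5)); -- p1+p2 = 9 digits total, s1+s2 = 5 after the point 
 INSERT INTO products SELECT num1 * num2 FROM decimals; -- can never overflow the result column 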

Example

Starting from numbers that can be stored as DECIMAL(2,1):

 SELECT 9.9 * 9.9; 

gives 98.01, which can be stored as DECIMAL(4,2).

 SELECT 9.9 * 0.5; 

gives 4.95, which can be stored as DECIMAL(3,2).
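If you want to see which type MySQL itself infers for the result, one way (just a sketch, using a hypothetical scratch table t) is to materialize the expression and inspect the resulting column definition:

 CREATE TABLE t AS SELECT 9.9 * 9.9 AS product; -- let MySQL choose the column type 
 DESCRIBE t; -- should report the product column as decimal(4,2) 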
