Euclidean Algorithm Complexity

I have a question about the Euclidean algorithm for finding the greatest common divisor.

gcd(p, q), where p > q and q is an n-bit integer.

I am trying to analyze the time complexity of the following algorithm (with the n-bit input as above):

gcd(p, q)
    if (p == q) return q
    if (p < q) return gcd(q, p)
    while (q != 0)
        temp = p % q
        p = q
        q = temp
    return p
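
For reference, here is a minimal runnable C sketch of the same pseudocode with an iteration counter added; the counter and the test values in main are only for illustration and are not part of the algorithm.

#include <stdio.h>

/* The pseudocode above written out in C, with a counter for the number of
   while-loop iterations (the counter is an addition for measurement only). */
unsigned long long gcd_count(unsigned long long p, unsigned long long q,
                             unsigned *iterations)
{
    *iterations = 0;
    if (p == q)
        return q;
    if (p < q) {                       /* same effect as the recursive swap */
        unsigned long long t = p;
        p = q;
        q = t;
    }
    while (q != 0) {
        unsigned long long temp = p % q;
        p = q;
        q = temp;
        ++*iterations;
    }
    return p;
}

int main(void)
{
    unsigned it;
    /* Consecutive Fibonacci numbers F(20) = 6765 and F(19) = 4181 are the
       classic slow case for the Euclidean algorithm. */
    unsigned long long g = gcd_count(6765ULL, 4181ULL, &it);
    printf("gcd = %llu after %u iterations\n", g, it);
    return 0;
}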

I already understand that the sum of the two numbers, u + v, where u and v denote the initial values of p and q, shrinks to at most half of its previous value at each iteration.

Now let m be the number of iterations for this algorithm. We want to find the smallest integer m such that (1/2)^m(u + v) <= 1
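
(Solving the inequality itself is straightforward, assuming both u and v fit in n bits, so that u + v < 2^{n+1}:

    (1/2)^m (u + v) <= 1   <=>   2^m >= u + v   <=>   m >= log_2(u + v)

and since u + v < 2^{n+1}, the smallest such m is at most n + 1, which is O(n). My confusion is about why this is the right inequality to solve, see below.)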

Here is my question. I get that after m iterations the sum of the two numbers is bounded above by (1/2)^m (u + v). But I do not understand why the maximum number of iterations m is reached when this value is <= 1.

The answer is O(n) for an n-bit q, but this is where I get stuck.

Please help!

1 answer

Suppose we have p and q, where p > q. Now there are two cases (a quick numeric check of the bound is sketched after them):

1) p >= 2q: in this case p mod q is less than q, so after the step the new sum q + (p mod q) is less than 2q, while the old sum p + q is at least 3q. So the sum will be no more than 2/3 of what it was before.

2) q < p < 2q: in this case the mod operation just subtracts q from p, so the new sum is q + (p - q) = p, while the old sum p + q is more than (3/2)p, since q > p/2. So again the sum will be no more than 2/3 of what it was before.
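
If you want to convince yourself of the 2/3 bound numerically, here is a small self-contained check (just an illustrative sketch, not needed for the proof); it runs the same iteration and asserts that each new sum is at most 2/3 of the previous one:

#include <assert.h>
#include <stdio.h>

/* Run the Euclidean iteration on (p, q) with p > q > 0 and check that the
   sum p + q shrinks to at most 2/3 of its previous value at every step. */
static void check_two_thirds(unsigned long long p, unsigned long long q)
{
    while (q != 0) {
        unsigned long long old_sum = p + q;
        unsigned long long temp = p % q;
        p = q;
        q = temp;
        /* new_sum <= (2/3) * old_sum, written without division:
           3 * new_sum <= 2 * old_sum */
        assert(3 * (p + q) <= 2 * old_sum);
    }
    printf("ok, gcd = %llu\n", p);
}

int main(void)
{
    check_two_thirds(1000000007ULL, 999999937ULL); /* two nearby primes     */
    check_two_thirds(6765ULL, 4181ULL);            /* consecutive Fibonacci */
    check_two_thirds(1024ULL, 3ULL);
    return 0;
}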

Therefore, at each step the sum shrinks to at most 2/3 of the previous sum. Since your numbers are n-bit, the initial sum is less than 2^{n+1}; so after at most log_{3/2}(2^{n+1}) steps, which is O(n), the sum would have to drop below 1, which means q must have reached 0 and the loop must have stopped by then.
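
Spelling out that last logarithm explicitly (log_2(3/2) ≈ 0.585 is just a constant):

    log_{3/2}(2^{n+1}) = (n + 1) / log_2(3/2) ≈ (n + 1) / 0.585 ≈ 1.71 (n + 1)

so the bound is roughly 1.71 (n + 1) mod operations, which is O(n).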

