I have a question about the Euclidean algorithm for finding the greatest common divisor gcd(p, q), where p > q and q is an n-bit integer.
I am trying to analyze the time complexity of the following algorithm (with the n-bit input as above):
```c
int gcd(int p, int q) {
    if (p == q) return q;
    if (p < q) return gcd(q, p);  /* swap so that p >= q (the original was missing this return) */
    while (q != 0) {
        int temp = p % q;
        p = q;
        q = temp;
    }
    return p;
}
```
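For concreteness, here is how I call it (the test values are my own, chosen arbitrarily with p > q):

```c
#include <stdio.h>

int gcd(int p, int q); /* defined above */

int main(void) {
    /* Arbitrary test values with p > q, as the analysis assumes. */
    printf("gcd(252, 105) = %d\n", gcd(252, 105)); /* prints 21 */
    return 0;
}
```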
I already understand that the sum p + q, where u and v denote the initial values of p and q, shrinks to at most half of its previous value at each iteration.
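To see this decay concretely, here is a throwaway sketch (my own instrumentation, not part of the algorithm) that prints the sum after each iteration:

```c
#include <stdio.h>

/* Trace how p + q shrinks across iterations, starting from u and v. */
int gcd_traced(int u, int v) {
    int p = u, q = v, iteration = 0;
    while (q != 0) {
        int temp = p % q;
        p = q;
        q = temp;
        iteration++;
        printf("after iteration %d: p + q = %d\n", iteration, p + q);
    }
    return p;
}

int main(void) {
    gcd_traced(252, 105);  /* sums: 147, 63, 21 -- each at most half the previous */
    return 0;
}
```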
Now let m be the number of iterations of this algorithm. We want to find the smallest integer m such that (1/2)^m (u + v) <= 1.
Here is my question: I see that after m iterations the sum is bounded above by (1/2)^m (u + v). But I do not understand why the maximum m is reached when this bound is <= 1.
The answer is O(n) for an n-bit q, but this is where I get stuck.
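For reference, spelling out the algebra behind that inequality (assuming, as above, that u and v are both n-bit integers):

```latex
% Smallest m with (1/2)^m (u + v) <= 1:
\left(\tfrac{1}{2}\right)^{m}(u+v) \le 1
\iff 2^{m} \ge u+v
\iff m \ge \log_{2}(u+v).
% If u and v are both n-bit, then u + v < 2^{n+1},
% so the smallest such m is at most n + 1, i.e. m = O(n).
```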
Please help!