One idea is to simplify the recurrence by introducing a new variable k such that 2^k = n. Then the recurrence becomes
T(2^k) = 2T(2^(k/2))
If you then define S(k) = T(2^k), you get the recurrence
S(k) = 2S(k/2)
Note that this is equivalent to
S(k) = 2S(k/2) + O(1)
since 0 = O(1). Therefore, by the Master Theorem, we get S(k) = Θ(k): here a = 2, b = 2, and d = 0, so log_b a = 1 > 0 = d.
Since S(k) = Θ(k) and S(k) = T(2^k) = T(n), we obtain T(n) = Θ(k). Because we chose 2^k = n, we have k = log n, and therefore T(n) = Θ(log n). This means your initial guess of O(log log n) is incorrect: the running time is only logarithmic, not doubly logarithmic. If only one recursive call were made, you would be right that the runtime would be O(log log n).
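To sanity-check this, here is a small Python sketch (my own illustration, not code from the question) that counts the calls made by a procedure following T(n) = 2T(√n), with a hypothetical base case at n ≤ 2:

```python
def count_calls(n):
    """Count calls for a procedure whose cost follows T(n) = 2T(sqrt(n)).

    Assumes a hypothetical base case at n <= 2.
    """
    if n <= 2:
        return 1
    root = int(round(n ** 0.5))  # recurse on sqrt(n), i.e. 2^(k/2)
    return 1 + 2 * count_calls(root)

# Squaring n (doubling k = log n) roughly doubles the call count,
# consistent with Theta(log n) rather than Theta(log log n):
print(count_calls(2 ** 16))  # 31
print(count_calls(2 ** 32))  # 63
```

Doubling the exponent (n → n²) doubles the total work plus one, exactly the S(k) = 2S(k/2) + O(1) behavior derived above.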
Hope this helps!