Due to the recursion, divide() is called up to n times, where n is the number of bits in x (each recursive call halves x).
Suppose simple arithmetic on n-bit integers takes O(n) time. (This is true in every bignum implementation that I know of; in Python, for example, adding 1 to a large integer copies the whole thing.)
Then we have a constant number of O(n) operations executed up to n times, for O(n^2) time overall.
def divide(x, y):
    """Binary long division: returns (q, r) with x == q*y + r and 0 <= r < y."""
    assert y >= 1
    if x == 0:
        return 0, 0
    # Recurse on the top bits of x, then shift quotient and remainder back up.
    q, r = divide(x // 2, y)
    q *= 2
    r *= 2
    # Restore the low bit of x that the halving dropped.
    if x & 1:
        r += 1
    # The remainder is now at most 2*y - 1, so one subtraction fixes it.
    if r >= y:
        r -= y
        q += 1
    return q, r
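
As a quick sanity check (my own test harness, not part of the code above), you can compare the function against Python's built-in divmod. Note that the recursion depth equals the bit length of x, so very large inputs need a raised recursion limit:

import sys

# Compare against the built-in divmod on a few cases.
for x, y in [(0, 3), (7, 1), (1234, 7), (2**64 + 5, 97)]:
    assert divide(x, y) == divmod(x, y)

# One recursive call per bit of x: a 5001-bit input needs ~5001 stack frames,
# which exceeds CPython's default limit of 1000.
sys.setrecursionlimit(10_000)
x = 1 << 5000
q, r = divide(x, 12345)
assert q * 12345 + r == x and 0 <= r < 12345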