Calculating Big O Notation with Recursion

I am trying to understand worst-case execution time in Big O notation, but I still do not quite get it.

This is the code I wrote recently:

    def g(n):
        if n == 0:
            return 1
        elif n == 1:
            return 2
        else:
            n_div = n // 2
            n_mod = n % 2
            return g(n_div) * g(n_mod)

I hope that I am at least right about this much:

    def g(n):
        if n == 0:
            return 1

and

        elif n == 1:
            return 2

are O(1), since they take constant time.

But what about the else part? Is it O(n) because it depends on n, or something else? Can someone explain what the Big O complexity of the else part is?

2 answers

Well, you noticed that you can split your function into 3 separate cases, and you have already determined that the first 2 are O(1). The third is a bit trickier, but you can split it into two parts.

The recursion obviously comes from:

 g(n//2)*g(n%2) 

We can see right away that n%2 will evaluate to either 0 or 1, which resolves to one of the first 2 cases, so we can ignore it. That leaves us with g(n//2). Rewriting this as a loop and printing the values, you will notice something:

    >>> n = 37
    >>> while n > 1:
    ...     n //= 2
    ...     print(n)
    ...
    18
    9
    4
    2
    1

As you can see, the value is halved every time, and the same thing happens in the recursion. That is logarithmic behavior.

As a result, the worst case for this function is O(log n).
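To see this concretely, here is a small sketch (the call-counting instrumentation is my own addition, not part of the original code) that tallies every call g makes for a given n:

```python
import math

def g_counted(n, counts):
    """Same logic as g, but counts every call (including base cases)."""
    counts[0] += 1
    if n == 0:
        return 1
    elif n == 1:
        return 2
    else:
        return g_counted(n // 2, counts) * g_counted(n % 2, counts)

for n in (16, 256, 4096):
    counts = [0]
    g_counted(n, counts)
    # For powers of two, the total call count works out to 2*log2(n) + 1,
    # because each halving step spawns one recursive call plus one
    # immediately-resolved base case.
    print(n, counts[0], int(math.log2(n)))
```

The call count grows with log2(n), not with n itself, which is exactly what O(log n) means.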

You can learn more about what log n really means in Big-O notation by looking at this question.


Big O notation is not really a property of the program text itself. It measures how the running time grows as the input size grows. In this case, the time-consuming part is the else branch.

You need to work out how that growth behaves for your program. Here is a simple empirical analysis that should help. If we instrument the program to print how many times the final else branch runs for a given input, we get this:

       n | times
    -----+------
       2 |   1
       4 |   2
       8 |   3
      16 |   4
      32 |   5
      64 |   6
     128 |   7
     256 |   8
     512 |   9
    1024 |  10
    2048 |  11
    4096 |  12
    8192 |  13
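A sketch of the kind of instrumentation that produces a table like this (the counting mechanism is illustrative, added by me):

```python
def g_instrumented(n, stats):
    """Same logic as the original g, but counts else-branch executions."""
    if n == 0:
        return 1
    elif n == 1:
        return 2
    else:
        stats['else_runs'] += 1
        return g_instrumented(n // 2, stats) * g_instrumented(n % 2, stats)

# Tally the else-branch executions for each power-of-two input.
for n in (2 ** k for k in range(1, 14)):
    stats = {'else_runs': 0}
    g_instrumented(n, stats)
    print(f"{n:5d} | {stats['else_runs']:3d}")
```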

If you plot it, you will see something like this:

[plot: number of else-branch executions vs. n, showing logarithmic growth]

You can see that as the input size increases, the number of calls also increases, but sublinearly. In fact, it is logarithmic, since n is halved at each recursive step.


Source: https://habr.com/ru/post/1273539/

