Base case and time complexity in recursive algorithms

I would like some clarification regarding O(n) complexity. I am using SICP.

Consider the factorial function from the book that generates a recursive process, in pseudo-code:

function factorial1(n) {
    if (n == 1) {
        return 1;
    }
    return n*factorial1(n-1);
}

I do not know how to measure the number of steps. That is, I do not know how a "step" is defined, so I used this instruction from the book to decide what counts as a step:

So we can compute n! by computing (n-1)! and multiplying the result by n.

I thought this is what they mean by a step. For a specific example, if we trace factorial(5):

  • factorial (1) = 1 = 1 step (base case - constant time)
  • factorial (2) = 2 * factorial (1) = 2 steps
  • factorial (3) = 3 * factorial (2) = 3 steps
  • factorial (4) = 4 * factorial (3) = 4 steps
  • factorial (5) = 5 * factorial (4) = 5 steps

So computing factorial(n) this way takes n steps, which is O(n).
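To make this concrete, here is a small sketch (my own illustration, not from the book) that instruments factorial1 with a call counter, treating each recursive call as one step:

```javascript
// Instrumented copy of factorial1: each call counts as one "step".
let steps = 0;

function factorial1(n) {
    steps += 1;                  // one step per call
    if (n == 1) {
        return 1;
    }
    return n * factorial1(n - 1);
}

const result = factorial1(5);
console.log(result, steps);      // 120 and 5 steps, matching the list above
```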

Now suppose we change the base case so that the recursion stops at n = 0 instead:

function factorial2(n) {
    if (n == 0) {
        return 1;
    }
    return n*factorial2(n-1);
}

Counting steps the same way for factorial2(n):

  • factorial (0) = 1 = 1 step (base case - constant time)
  • factorial (1) = 1 * factorial (0) = 2 steps
  • ...

So, if the complexity is measured in steps, is factorial2 O(n + 1) (the extra 1 being the base case at 0), while factorial1 is O(n) (n steps)? Or are they both O(n)?

+4
3

First, note that factorial1 never terminates when called with n = 0: the test n == 1 never succeeds, so it recurses forever (in practice, until the stack overflows). factorial2 does handle n = 0.

As for the complexity: you could say factorial1 is O(n) and factorial2 is O(n + 1). But as n grows, the extra constant (+1) becomes insignificant, so both are simply O(n). As the Wikipedia article on Big O notation puts it:

the function g(x) appearing within the O(...) is typically chosen to be as simple as possible, omitting constant factors and lower order terms.

In other words, you keep only the dominant term and drop constants. In n + 1, the dominant term is n, so O(n + 1) simplifies to O(n). The extra base-case step changes nothing asymptotically.
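As a sanity check, here is a sketch (my own, with hypothetical step-counting helpers steps1 and steps2) showing that the two base cases differ by exactly one step, a constant that big O discards:

```javascript
// One "step" per recursive call, as in the question.
function steps1(n) {              // base case n == 1 (factorial1)
    return n == 1 ? 1 : 1 + steps1(n - 1);
}

function steps2(n) {              // base case n == 0 (factorial2)
    return n == 0 ? 1 : 1 + steps2(n - 1);
}

for (let n = 1; n <= 10; n++) {
    console.assert(steps1(n) === n);      // n steps
    console.assert(steps2(n) === n + 1);  // always exactly one more
}
```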

Does that answer your question?

+2

Here is one way to count the steps explicitly.

function factorial1(n) {
    r1 = (n == 1);          // one step
    if (r1) { return 1; }   // second step ... stops only when n == 1
    r2 = factorial1(n-1);   // third step ... plus however many steps
                            //   it takes to compute factorial1(n-1)
    r3 = n * r2;            // fourth step
    return r3;
}

Since each call to factorial1(n) with n > 1 performs four steps plus whatever factorial1(n-1) takes, and factorial1(1) performs two, the running time satisfies the recurrence:

T(1) = 2
T(n) = 4 + T(n-1)

Solving it gives T(n) = 4n - 2, i.e. about 4n steps, which is O(n). The exact constant (that is, what you decide to count as one step) does not change the asymptotic class.
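The recurrence can be verified by instrumenting the function (my own sketch): two counted steps in the base case, four in every other call:

```javascript
// T(1) = 2, T(n) = 4 + T(n-1)  =>  T(n) = 4n - 2
let T = 0;

function factorial1(n) {
    T += 2;                          // steps 1-2: compute the test, branch
    if (n == 1) { return 1; }
    const r2 = factorial1(n - 1);    // recursive call
    T += 2;                          // steps 3-4: the call assignment and multiply
    return n * r2;
}

factorial1(10);
console.log(T);                      // 4*10 - 2 = 38
```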

+2

The other answers are correct, but I want to offer another way of looking at it.

Your function is O(n). It is also O(n + 1), O(2n + 3), O(6n - e), and O(0.67777n - e^67). All of these statements are true; O(n) is simply the simplest of them. The function is no more O(n + 1) than it is O(n), and no less.

The reason is that big O is transitive. That is:

g(n) = O(f(n)), f(n) = O(k(n)) --implies--> g(n) = O(k(n))

(If this is not obvious, google the formal definition of big O.) Applying transitivity here:

n = O(n+1), factorial1 = O(n) --implies--> factorial1 = O(n+1)

So there is no real difference between O(n) and O(n + 1): they describe exactly the same set of functions. Saying a function is O(n + 1) tells you nothing that O(n) does not. We prefer O(n) only because it is simpler.
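This can also be checked directly from the definition (my addition): f = O(g) means there are constants c > 0 and n0 such that f(n) <= c*g(n) for all n >= n0. Each of n and n + 1 bounds the other:

```latex
% n + 1 = O(n):  for n >= 1,  n + 1 <= 2n            (take c = 2, n_0 = 1)
% n = O(n + 1):  for n >= 0,  n <= 1 \cdot (n + 1)   (take c = 1, n_0 = 0)
n + 1 \le 2n \quad (n \ge 1), \qquad n \le n + 1 \quad (n \ge 0)
```

So the two classes coincide.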

If you want notation that pins down the growth rate exactly, use Θ (big theta), which is a tight bound rather than just an upper bound. The common classes are:

Θ(1)        # Constant 
Θ(log(n))   # Logarithmic
Θ(n)        # Linear
Θ(n^2)      # Quadratic
Θ(n^3)      # Cubic
Θ(2^n)      # Exponential (Base 2)
Θ(n!)       # Factorial

Each function belongs to at most one Θ class, while it is big O of its own class and of every faster-growing one: something that is Θ(n) is also O(n^2), O(2^n), and so on. That is why the Θ statement is the precise one.

In practice, when people write O they very often mean Θ: saying "this algorithm is O(n)" usually carries the implication that it actually grows linearly, not just at most linearly.
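For reference, the standard definitions (my addition, paraphrasing the usual textbook ones):

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 :\ \forall n \ge n_0,\ f(n) \le c\, g(n)
f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 :\ \forall n \ge n_0,\ c_1\, g(n) \le f(n) \le c_2\, g(n)
```

Θ requires matching bounds on both sides, which is why it identifies the growth rate uniquely.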

Finally, remember that O(n) is a set: "f is O(n)" means "f is a member of the set of all functions whose growth rate is linear or less". So saying that

a function is more O(N+1) and less O(N)

would be equivalent to saying

a function is more "a member of the set of all functions that have linear
growth rate or less growth rate" and less "a member of the set of all 
functions that have linear or less growth rate"

This is pretty absurd: the two descriptions name exactly the same set.

+1

Source: https://habr.com/ru/post/1682240/

