Is O(log(n)) complexity equivalent to O(sqrt(n))?

My professor just taught us that, as a rule of thumb, any operation that halves the length of the input has O(log(n)) complexity. Why is it not O(sqrt(n))? Aren't the two equivalent?

+28
5 answers

They are not equivalent: sqrt(N) grows much faster than log2(N). There is no constant C such that sqrt(N) < C·log(N) for all values of N greater than some minimum.

An easy way to see this is that log2(N) is close to the number of (binary) digits of N, while sqrt(N) is a number that has half as many digits as N itself. Or, to state it as an equality:

    log2(N) = 2·log2(sqrt(N))

So you need to take the logarithm(!) of sqrt(N) to bring it down to the same order of complexity as log2(N).

For example, for a binary number with 11 digits, 0b10000000000 (= 2^10), the square root is 0b100000 (= 2^5 = 32), but the logarithm is only 10.
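A quick numeric check of this (a sketch in Python, standard library only): the ratio sqrt(N)/log2(N) keeps growing, so no single constant C can cap it.

```python
import math

# The gap between sqrt(N) and log2(N) widens without bound, so no constant
# C can satisfy sqrt(N) <= C * log2(N) for all large N.
for exp in (10, 20, 40, 80):
    n = 2 ** exp
    print(f"2^{exp}: log2 = {math.log2(n):.0f}, sqrt = {math.sqrt(n):.0f}, "
          f"ratio sqrt/log2 = {math.sqrt(n) / math.log2(n):.1f}")
```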

+43

Assuming the natural logarithm (otherwise just multiply by a constant), we have

lim {n->inf} log n / sqrt(n) = (inf / inf)

                        =  lim {n->inf} 1/n / 1/(2*sqrt(n)) (by L'Hospital)
                        =  lim {n->inf} 2*sqrt(n)/n
                        =  lim {n->inf} 2/sqrt(n)
                        =  0 < inf
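The limit above can be approached numerically as well (a sketch in Python, standard library only): the ratio log(n)/sqrt(n) shrinks toward 0 as n grows.

```python
import math

# Numerically approach the limit log(n)/sqrt(n) -> 0 as n -> infinity.
for n in (10 ** 2, 10 ** 4, 10 ** 8, 10 ** 16):
    print(f"n = {n:.0e}: log(n)/sqrt(n) = {math.log(n) / math.sqrt(n):.2e}")
```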

By the definition of big-O notation (https://en.wikipedia.org/wiki/Big_O_notation), this means that log n = O(sqrt(n)): log n grows no faster than sqrt(n).

Plotting log n against sqrt(n) makes the difference visible even for small n.

[plot: log n vs. sqrt(n)]

+10

No, they are not equivalent, but the rule generalizes.

@trincot gave a nice explanation. I would only add that your professor's statement, stated precisely, is

any operation that halves the length of the input has an O(log(n)) complexity

Similarly,

any operation that reduces the length of the input by 2/3rd has an O(log3(n)) complexity
any operation that reduces the length of the input by 3/4th has an O(log4(n)) complexity
any operation that reduces the length of the input by 4/5th has an O(log5(n)) complexity
and so on...

In general, any operation that reduces the length of the input by (B-1)/Bth has an O(logB(n)) complexity.

N.B.: here O(logB(n)) means the logarithm of n taken to the base B.
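The step-count claim above can be sketched like this (Python; `steps_to_one` is a helper name of my own choosing): keeping 1/B of the input each step, i.e. discarding (B-1)/B of it, takes about log_B(n) steps.

```python
import math

def steps_to_one(n, b):
    """Count integer divisions by b needed to shrink n down to 1."""
    steps = 0
    while n > 1:
        n //= b
        steps += 1
    return steps

# Using exact powers b**10 so the step count is exactly 10 = log_b(n).
for b in (2, 3, 4, 5):
    n = b ** 10
    print(f"base {b}: {steps_to_one(n, b)} steps, log_{b}(n) = {math.log(n, b):.1f}")
```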

+3

No, they are not equivalent; moreover, for every k > 0 and base > 1 (k = 1/2 gives the sqrt case),

   O(n**k) > O(log(n, base)) 

O(f(n)) is an asymptotic bound as n grows without limit. Suppose, toward a contradiction, that the two classes were equal:

  O(n**k) = O(log(n, base)) 

Then there would be a constant C such that

  n**k <= C * log(n, base) 

for sufficiently large n; since log(n, base) is positive for large n, this would make the ratio converge to a finite value:

  lim(n**k/log(n, base)) = C 
  n->+inf

But applying L'Hospital's Rule shows the ratio is unbounded:

  lim(n**k/log(n, base)) = 

  lim([k*n**(k-1)]/[1/(n*ln(base))]) =

  ln(base) * k * lim(n**k) = +infinity

Since the limit is infinite, no constant C with n**k <= C*log(n, base) for large n can exist, and therefore

 O(n**k) > O(log(n, base)) 
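One concrete instance of this unbounded ratio, checked numerically (a Python sketch; k = 1/2 and base = 2 are just the sqrt-vs-log2 case from the question):

```python
import math

# For any k > 0 and base > 1 the ratio n**k / log(n, base) is unbounded.
k, base = 0.5, 2
for n in (10 ** 2, 10 ** 6, 10 ** 12):
    print(f"n = {n:.0e}: n**k / log(n, base) = {n ** k / math.log(n, base):.1f}")
```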
+2

No, it is not. When we deal with time complexity, we think of the input as a very large number. So let n = 2^18. Then for sqrt(n) the number of operations will be 2^9, while for log(n) it will be 18 (taking log to base 2). Obviously 2^9 is much larger than 18, so we can say that O(log n) grows more slowly than O(sqrt n).
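The same 2^18 example, spelled out (a Python sketch, standard library only):

```python
import math

# Concrete instance: for n = 2**18, sqrt(n) = 2**9 = 512 but log2(n) = 18.
n = 2 ** 18
print("sqrt(n) =", math.isqrt(n))
print("log2(n) =", int(math.log2(n)))
```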

+1

Source: https://habr.com/ru/post/1668794/
