Asymptotic complexity of logarithmic functions

I know that in terms of complexity, O(log n) is faster than O(n), which is faster than O(n log n), which is faster than O(n^2). But what about O(n^2) versus O(n^2 log n), or O(n^2.001) versus O(n^2 log n):

T1(n)=n^2 + n^2logn 

What are the big-O and big-Omega of this function? And what about little-o? Compared with:

 T2(n)=n^2.001 + n^2logn 

Is there a difference in big-O now? I am having trouble understanding how to compare log n with powers of n. Is log n more like n^0.000000...1 or like n^1.000000...1?

+5
2 answers

O(n^k) is faster than O(n^k') for all k, k' >= 0 with k' > k.

O(n^2) will be faster than O(n^2 log n).

Note that while you can ignore constant factors, nothing that involves the input size can be ignored.

Thus, for T(n) = n^2 + n^2 log n, the complexity is dominated by the worse of the two terms, which gives O(n^2 log n).
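
To see that concretely, here is a quick Python check (the sample values of n are arbitrary): the ratio T(n) / (n^2 log n) stays bounded, and in fact settles toward 1.

    import math

    # Ratio of T(n) = n^2 + n^2*ln(n) to the dominant term n^2*ln(n).
    # If T(n) = O(n^2 log n), this ratio must stay below some constant;
    # here it actually drifts toward 1.
    for n in (10, 10**3, 10**6, 10**9, 10**12):
        ratio = (n**2 + n**2 * math.log(n)) / (n**2 * math.log(n))
        print(f"n = {n:.0e}: ratio = {ratio:.4f}")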


Little-o

Little-o, loosely speaking, is a guaranteed upper bound that is never tight. Yes, it is called little, and yet it is the more restrictive notion.

n^2 = O(n^k) for k >= 2, but n^2 = o(n^k) only for k > 2.
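
For example, with an arbitrarily chosen k = 2.1, a short Python check shows the ratio n^2 / n^2.1 shrinking toward 0, which is exactly what little-o demands:

    # n^2 = o(n^2.1): the ratio n^2 / n^2.1 = n^(-0.1) goes to 0, so no
    # constant multiple of n^2 keeps up with n^2.1 forever (2.1 is just an
    # arbitrary pick of some k > 2).
    for n in (10.0, 1e4, 1e8, 1e16):
        print(f"n = {n:.0e}: n^2 / n^2.1 = {n**2 / n**2.1:.4f}")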

In practice, though, it is big-O that gets most of the spotlight.


What about T(n) = n^2.001 + n^2 log n?

We have n^2.001 = n^2 * n^0.001 on one side and n^2 * log(n) on the other.

To resolve this, we need to find out which will be larger in the end: n^0.001 or log(n).

It turns out that a function of the form n^k with k > 0 will eventually overtake log(n) for sufficiently large n.

The same holds here, so T(n) = O(n^2.001).

In practice, log(n) will be greater than n^0.001.

For example, (10^3300)^0.001 < log10(10^3300), since 1995.3 < 3300, and a "sufficiently large" n in this case turns out to be only about 10^3650, an astronomical number.

Once again: 10^3650. There are only about 10^82 atoms in the universe.
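
Those figures are easy to reproduce without ever materializing such a number, by working with the exponents alone (the sample exponents below are the ones quoted above):

    # Compare n^0.001 with log10(n) for n = 10^x, using only the exponent x:
    # n^0.001 = 10^(x/1000) and log10(n) = x, so nothing huge is ever built.
    for x in (100, 1000, 3300, 3650):
        power = 10 ** (x / 1000)              # value of n^0.001
        winner = "n^0.001" if power > x else "log10(n)"
        print(f"n = 10^{x}: n^0.001 = {power:.1f}, log10(n) = {x} -> {winner} wins")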

+3

T(n)=n^2 + n^2logn

What are the big-O and big-Omega of this function? And what about little-o?

To quote from a previous answer:

Don't forget that big-O notation describes a set. O(g(n)) is the set of all functions f that do not grow faster than g; formally, there exist C and n0 such that for every n >= n0 we have |f(n)| <= C|g(n)|. The expression f(n) = O(g(n)) is shorthand for saying that f(n) belongs to the set O(g(n)).
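
To make that definition concrete for your T1 (the witnesses C and n0 below are hand-picked for this sketch, not part of the quoted answer):

    import math

    def f(n):
        return n**2 + n**2 * math.log(n)   # the function from the question

    def g(n):
        return n**2 * math.log(n)          # the claimed bound

    # Once ln(n) >= 1 (i.e. n >= 3), the extra n^2 term is at most
    # n^2*ln(n), so f(n) <= 2*g(n). Check the witnesses C = 2, n0 = 3:
    C, n0 = 2, 3
    assert all(f(n) <= C * g(n) for n in range(n0, 100_000))
    print(f"f(n) <= {C}*g(n) holds for every n in [{n0}, 100000)")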

You can also think of big-O as <= and little-o as <. So you usually care about finding the tight big-O bound rather than a little-o bound. In your case it is even appropriate to use big-Theta, which plays the role of =. Since n^2 log n dominates n^2, it is true that

 T1(n) = n^2 + n^2 log n = Θ(n^2 log n)

Now the second part. log n grows so slowly that even n^e with e > 0 dominates it; in fact, you can prove that lim n^e / (log n)^k = inf as n goes to infinity, for any e > 0 and fixed k (a quick numeric check is sketched below). It follows that n^0.001 dominates log n, and therefore

 T2(n) = n^2.001 + n^2 log n = Θ(n^2.001).
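
Here is the numeric check of that limit mentioned above, a small Python sketch with e and k picked arbitrarily (much larger than 0.001, simply so the growth is visible at ordinary n):

    import math

    # Sample n^e / (log n)^k for e = 0.5, k = 3: even a small power of n
    # eventually crushes any fixed power of the logarithm.
    for n in (1e2, 1e6, 1e12, 1e18, 1e24):
        ratio = n**0.5 / math.log(n)**3
        print(f"n = {n:.0e}: n^0.5 / (ln n)^3 = {ratio:,.1f}")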

If f(n) = Θ(g(n)), then it is also true that f(n) = O(g(n)), so to answer your question:

  • T1(n)=O(n^2 logn)
  • T2(n)=O(n^2.001)
+1

Source: https://habr.com/ru/post/1232535/

