O(n^k) is faster than O(n^k') for all k, k' >= 0 with k' > k.
Likewise, O(n^2) will be faster than O(n^2 * log n).
Note that while you can ignore constants, nothing that depends on the input size can be ignored.
Thus, for the complexity T(n) = n^2 + n^2 * log n, the larger of the two terms wins, giving T(n) = O(n^2 * log n).
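A quick numerical sanity check (my own illustration, not from the original text) makes the domination visible: the ratio between the two terms is simply log(n), which grows without bound.

```python
import math

# Compare the two terms of T(n) = n^2 + n^2*log(n) for growing n.
# Their ratio is exactly log(n), so the n^2*log(n) term dominates.
for n in (10, 10**3, 10**6, 10**9):
    quadratic = n ** 2
    quad_log = n ** 2 * math.log(n)
    print(f"n = {n:>10}: (n^2 * log n) / n^2 = {quad_log / quadratic:.1f}")
```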
Little-oh
A brief aside on this notation: little-oh is also an upper bound, but a strict one. Yes, it is called 'little', yet it is the more restrictive of the two.
n^2 = O(n^k) for k >= 2, but n^2 = o(n^k) only for k > 2.
In practice, though, it is Big-Oh that gets most of the spotlight.
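To make the distinction concrete, here is a small sketch of my own (not from the original): Big-O only requires the ratio f(n)/g(n) to stay bounded, while little-oh requires it to tend to zero.

```python
# For f(n) = n^2:
#   g(n) = n^2 -> ratio stays at 1     : f = O(g), but f is NOT o(g)
#   g(n) = n^3 -> ratio tends to zero  : f = o(g) (and therefore also O(g))
for n in (10, 10**2, 10**4, 10**6):
    f = n ** 2
    print(f"n = {n:>8}: n^2/n^2 = {f / n**2:.2f}, n^2/n^3 = {f / n**3:.8f}")
```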
What about
T(n) = n^2.001 + n^2 * log(n) ?
We have n^2.001 = n^2 * n^0.001 on one side and n^2 * log(n) on the other.
To answer, we need to find out which is ultimately larger: n^0.001 or log(n).
It turns out that any function of the form n^k with k > 0 will eventually overtake log(n) for sufficiently large n.
Hence, in this case T(n) = O(n^2.001).
In practice, however, log(n) will be greater than n^0.001.
For example, (10^3300)^0.001 < log(10^3300), since 1995.3 < 3300, and a "sufficiently large" n here turns out to be a mere 10^3550 or so, an astronomical number.
Once again: 10^3550. For comparison, there are only about 10^82 atoms in the observable universe.
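A minimal sketch (my own, assuming base-10 logarithms as in the comparison above) locates that crossover without ever constructing numbers like 10^3300: write n = 10^x, so n^0.001 becomes 10^(x/1000) while log10(n) is simply x.

```python
def power_term(x: float) -> float:
    """n^0.001 for n = 10^x, computed via the exponent to avoid huge numbers."""
    return 10 ** (x / 1000)

def log_term(x: float) -> float:
    """log10(n) for n = 10^x."""
    return x

# At n = 10^3300 the logarithm is still ahead, as claimed above.
print(power_term(3300), "<", log_term(3300))  # ~1995.3 < 3300

# Scan exponents to find where n^0.001 finally overtakes log10(n).
for x in range(3300, 4000):
    if power_term(x) > log_term(x):
        print(f"n^0.001 overtakes log10(n) near n = 10^{x}")  # around 10^3551
        break
```

Choosing a different log base shifts the crossover somewhat, but it stays in the same astronomical range.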