Either f(n) = O(g(n)) or g(n) = O(f(n))

I am trying to prove that this is true for any functions f and g with domain and codomain N. I saw how this can be proved using limits, but apparently it can also be proved without them.

In essence, I am trying to prove that "if f(n) is not O(g(n)), then g(n) must be O(f(n))." My problem is understanding what "f is not O(g)" actually means.

According to the formal definition of big-O, if f(n) = O(g(n)), then n >= N implies f(n) <= c * g(n) for some N and some constant c. If f(n) != O(g(n)), I think it means that there is no c satisfying this inequality for all values of n. However, I do not see how to use this fact to prove that g(n) = O(f(n)); it does not show that a constant c' exists with g(n) <= c' * f(n), which is what I would need to answer the question.

+6
2 answers

Not true. Let f(n) = 1 if n is odd and 0 otherwise, and g(n) = 1 if n is even and 0 otherwise.

To say that f is O(g) is to say that there exist constants C > 0 and N > 0 such that f(n) <= C * g(n) for all n > N. Let n = 2 * N + 1, so n is odd. Then f(n) = 1 but g(n) = 0, so f(n) <= C * g(n) is impossible. Therefore f is not O(g).

Similarly, we can show that g is not O(f).
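
To make this concrete, here is a quick numeric check (a sketch in Python; the helper names are mine, not from the answer):

    # Parity counterexample: f is 1 on odd n, g is 1 on even n.
    def f(n):
        return 1 if n % 2 == 1 else 0

    def g(n):
        return 1 if n % 2 == 0 else 0

    # Whatever C and N a proof of f = O(g) proposed, the odd number
    # n = 2*N + 1 exceeds N and violates f(n) <= C * g(n), since 1 > C * 0.
    for C in (1, 10, 1000):
        for N in (5, 50, 500):
            n = 2 * N + 1
            assert f(n) > C * g(n)

Swapping the roles of f and g gives the same contradiction with an even witness n = 2 * N + 2, which is the symmetric half of the argument.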

+4

First of all, your definition of big-O is a bit off. You say:

I think this means that there is no c that satisfies this inequality for all values of n.

In fact, you need to choose a single value of c that satisfies the inequality for every value of n beyond some point, not a different c for each n.
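
Written out with quantifiers (my rendering, to make the negation explicit), the definition and its negation are:

    f(n) = O(g(n)) \iff \exists c > 0\ \exists N\ \forall n \ge N : f(n) \le c \cdot g(n)

    f(n) \ne O(g(n)) \iff \forall c > 0\ \forall N\ \exists n \ge N : f(n) > c \cdot g(n)

Note that the negation flips every quantifier: to refute f = O(g) you must defeat every candidate pair (c, N), not just one.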

In any case, to answer the question:

I do not believe that the statement in the question is true... Let's see if we can think of a counterexample, where f(n) ≠ O(g(n)) and g(n) ≠ O(f(n)).

Note: I will use n and x interchangeably, as it is easier for me to think about it that way.

We would have to come up with two functions that keep crossing each other as they go to infinity. Not only that, but they would have to keep crossing each other regardless of the constant c we multiply them by.

So this leads me to think that the functions will have to alternate between two different time complexities.

Consider a function that alternates between y = x and y = x^2:

 f(x) = 0.2 * (x * sin(x) + x^2 * (1 - sin(x)))

[Graph 1: plot of f(x)]

Now, if we create a similar function whose oscillation is slightly offset:

 g(x) = 0.2 * (x * cos(x) + x^2 * (1 - cos(x)))

Then these two functions will keep crossing each other indefinitely.

For any number n you choose, no matter how large, there will be an x1 greater than n at which f(x1) grows like x1^2 while g(x1) grows like x1. Similarly, there will be an x2 at which g(x2) grows like x2^2 while f(x2) grows like x2.

Because such points occur arbitrarily far out, you cannot select any constants c1 or c2 that guarantee f(x) <= c1 * g(x) for all large x, or g(x) <= c2 * f(x) for all large x.
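
A short numeric check of this claim (my sketch in Python, using the two functions above):

    import math

    def f(x):
        return 0.2 * (x * math.sin(x) + x**2 * (1 - math.sin(x)))

    def g(x):
        return 0.2 * (x * math.cos(x) + x**2 * (1 - math.cos(x)))

    # At x = pi/2 + 2*pi*k: sin(x) = 1, cos(x) = 0, so f(x) = 0.2x but g(x) = 0.2x^2.
    # At x = 2*pi*k: sin(x) = 0, cos(x) = 1, so f(x) = 0.2x^2 but g(x) = 0.2x.
    for k in (10, 100, 1000):
        x1 = math.pi / 2 + 2 * math.pi * k
        x2 = 2 * math.pi * k
        print(f"k={k}: g(x1)/f(x1) ~ {g(x1) / f(x1):.0f}, f(x2)/g(x2) ~ {f(x2) / g(x2):.0f}")

Both ratios grow roughly like x itself, so no fixed c1 or c2 can ever cap them.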

In conclusion, f(n) ≠ O(g(n)) does not imply that g(n) = O(f(n)).

+5

Source: https://habr.com/ru/post/948425/

