First of all, your definition of big-O is a bit confused. You say:

> I think this means that there is no c that satisfies this inequality for all values of n.

In fact, for f(n) = O(g(n)) you need to find a single constant c (together with a threshold n₀) such that f(n) ≤ c·g(n) holds for all n ≥ n₀, not just for some values of n.
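To make the definition concrete, here is a small sketch with an example of my own choosing (f(n) = 2n + 3, which is O(n) with c = 3 and n₀ = 3); the constants are assumptions picked for illustration, not anything from the question:

```python
# Definition check: f(n) = O(g(n)) means there exist c > 0 and n0
# such that f(n) <= c * g(n) for ALL n >= n0.
def f(n):
    return 2 * n + 3  # example function, claimed to be O(n)

def g(n):
    return n

c, n0 = 3, 3  # one fixed pair of constants, chosen up front

# The same c must work for every n >= n0, not a different c per n.
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) <= 3*g(n) holds for all tested n >= 3")
```

Note that c is fixed once, before looking at n; that is the whole point of the definition.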
In any case, to answer the question:
I do not believe that the statement in the question is true. Let's see if we can think of a counterexample, where f(n) ∉ O(g(n)) and g(n) ∉ O(f(n)).
Note: I will use n and x interchangeably, as it is easier for me to think about it that way.
We would have to come up with two functions that keep crossing each other as they go to infinity. And not only that: they would have to keep crossing no matter what constant c we multiply one of them by. This suggests that the two functions must alternate between two different growth rates.
Consider a function that alternates between y = x and y = x²:

    f(x) = 0.2 * (x * sin(x) + x^2 * (1 - sin(x)))

Now, if we create a similar function with the wobble slightly offset:

    g(x) = 0.2 * (x * cos(x) + x^2 * (1 - cos(x)))
Then these two functions will keep crossing each other indefinitely.
For any number n you choose, no matter how large, there will be an x₁ > n where f(x₁) behaves like x₁² while g(x₁) behaves like x₁. Similarly, there will be an x₂ > n where g(x₂) behaves like x₂² while f(x₂) behaves like x₂.
Because such points exist beyond every threshold, no constant c₁ can guarantee f(x) ≤ c₁ * g(x) for all large x, and no constant c₂ can guarantee g(x) ≤ c₂ * f(x) for all large x.
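You can see this numerically with a quick sketch (the sample points x = π/2 + 2πk, where sin is 1 and cos is 0, and x = 2πk, where cos is 1 and sin is 0, are my choice of where to probe):

```python
import math

def f(x):
    # alternates between ~0.2*x (when sin(x) = 1) and ~x^2 scale otherwise
    return 0.2 * (x * math.sin(x) + x**2 * (1 - math.sin(x)))

def g(x):
    # same shape, wobble offset by a quarter period
    return 0.2 * (x * math.cos(x) + x**2 * (1 - math.cos(x)))

# The ratio f/g keeps flipping between very small and very large,
# so no single constant c can bound one function by the other.
for k in [10, 100, 1000]:
    x1 = math.pi / 2 + 2 * math.pi * k  # sin(x1) = 1: f ~ x, g ~ x^2
    x2 = 2 * math.pi * k                # cos(x2) = 1: f ~ x^2, g ~ x
    print(f"x = {x1:10.1f}  f/g = {f(x1) / g(x1):.6f}")
    print(f"x = {x2:10.1f}  f/g = {f(x2) / g(x2):.1f}")
```

As k grows, the first ratio tends to 0 and the second tends to infinity, which is exactly the behaviour that rules out both f(n) = O(g(n)) and g(n) = O(f(n)).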
In conclusion, f(n) ∉ O(g(n)) does not imply that g(n) = O(f(n)).