How can we prove that an algorithm's running-time bound is tight?

Suppose we can prove that an algorithm, called with input of size n, runs in O(f(n)).

I want to prove that this bound is tight. Two questions:

  • Isn't it enough to exhibit a particular input and show that the running time is at least f(n)?
  • I have read that one way to prove tightness is to "reduce sorting to it." I have no idea what is meant by this.
2 answers

Isn't it enough to exhibit a particular input and show that the running time is at least f(n)?

Yes, assuming you are talking about worst-case complexity.
If you are talking about worst-case complexity, and you have proved that the algorithm runs in O(f(n)), then if you find inputs on which it takes at least C*f(n) steps for some constant C > 0, you have effectively proved that the algorithm (in its worst-case performance) is in Ω(f(n)). Since O(f(n)) ∩ Ω(f(n)) = Θ(f(n)), this means your algorithm runs in Θ(f(n)) under worst-case analysis.
Note that you actually need a "family" of examples, one for each input size (or at least for infinitely many sizes). A single example, or examples that are valid only for small n and not for n > N (for some N), pins down only finitely many values of n and says nothing about the asymptotic behavior; you must show that the family of examples covers the case n > N as well.
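As a concrete illustration (my own example, not part of the original answer): insertion sort runs in O(n^2), and the family of strictly decreasing inputs, one for every size n, shows the bound is tight, because on a reversed sequence of length n it performs exactly n(n-1)/2 element comparisons.

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` with insertion sort, counting element comparisons."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison per inner-loop step
            if a[j] > key:
                a[j + 1] = a[j]       # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

# The family of "bad" inputs: one reversed sequence for EVERY size n.
for n in (10, 100, 1000):
    worst = list(range(n, 0, -1))     # strictly decreasing input of size n
    assert insertion_sort_comparisons(worst) == n * (n - 1) // 2
```

Because the count grows as n(n-1)/2 ≥ C·n² for C = 1/4 (and n ≥ 2), this family witnesses Ω(n²), which together with the O(n²) upper bound gives Θ(n²) in the worst case.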

Symmetrically, if you prove that the algorithm's best-case performance is Ω(f(n)), and you find a family of inputs on which it runs "better" (faster) than C*f(n) for some constant C, you have effectively proved that the algorithm is Θ(f(n)) under best-case analysis.


This trick does NOT work for average-case analysis, where you have to compute the expected running time; a single special example will not help there.

I read that one way to prove tightness is to "reduce sorting to it." I have no idea what is meant by this.

This is done to prove a much stronger statement: that no algorithm at all can solve a certain problem faster than some bound. The usual approach is to assume there is some black-box algorithm A that runs in o(g(n)) time, and then use A to build a comparison-based sorting algorithm that runs in o(n log n) time. Since comparison sorting is known to require Ω(n log n) comparisons, this is a contradiction, and we conclude that the assumption about A is false: no such fast A can exist.

This proves the stronger claim: not only does OUR algorithm have this lower bound, but ALL algorithms that solve the problem have it.
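A classic instance of this pattern (my own illustration, not from the original answer) is reducing sorting to the 2-D convex hull problem: lift each number x onto the parabola y = x², ask a hull black box for the hull, and read the x-coordinates back in order. The hull routine below is only a stand-in for the black box A (it happens to sort internally, which a genuine black box would not); the point is the shape of the reduction.

```python
def convex_hull(points):
    """Stand-in for the black-box algorithm A (Andrew's monotone chain).
    Returns hull vertices in counter-clockwise order, starting leftmost."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0]-o[0]) * (b[1]-o[1]) - (a[1]-o[1]) * (b[0]-o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def sort_via_hull(xs):
    """Reduce sorting to convex hull: lift onto the parabola y = x*x.
    Every lifted point is a hull vertex, and the hull, traversed from the
    leftmost point, visits them in increasing x order."""
    points = [(x, x * x) for x in set(xs)]   # assumes distinct values
    hull = convex_hull(points)               # one call to the "black box"
    return [x for x, _ in hull]              # O(n) pre/post-processing

print(sort_via_hull([3, 1, 4, 1, 5, 9, 2, 6]))
```

Since the pre- and post-processing here take only O(n) time, a hypothetical o(n log n) convex-hull algorithm would yield an o(n log n) comparison sort, which is impossible; hence computing a convex hull requires Ω(n log n).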


ad 1: Yes, but you must be able to find such an input of size n for every n. A single example of size 3 that takes 9 steps does not, by itself, prove anything.

ad 2: If you can transform a sequence of elements so that your algorithm effectively sorts it (i.e., after some processing of the output you obtain the sorted sequence), you have reduced sorting to your algorithm. And since comparison-based sorting cannot be done faster than Ω(n log n), this proves that your algorithm also requires Ω(n log n) time.

Of course, for this argument to be valid, the input and output processing must take strictly less than O(n log n) time, i.e., o(n log n). Otherwise you could do the sorting inside the preprocessing step and "prove" that an O(1) algorithm that simply returns its input array is in fact Ω(n log n) :).
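To make the pitfall concrete (my own sketch, not from the original answer): the "reduction" below does produce sorted output, yet it tells us nothing about the black box, because all the Θ(n log n) work is hidden in the preprocessing step.

```python
def identity(a):
    """A trivial O(1)-per-call 'black box': returns its input unchanged."""
    return a

def bogus_reduction(xs):
    """INVALID reduction: the preprocessing itself sorts in Theta(n log n),
    so the sorted output proves nothing about `identity`."""
    preprocessed = sorted(xs)       # Theta(n log n) work smuggled in here
    return identity(preprocessed)   # the 'black box' does no work at all

print(bogus_reduction([5, 3, 8]))
```

This is why the reduction only carries the Ω(n log n) lower bound over to the black box when the surrounding pre- and post-processing run in o(n log n) time.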


Source: https://habr.com/ru/post/985994/
