The best case is almost never discussed. It is just not that interesting. The algorithm can always be modified to have the smallest theoretically possible best case, which is O(max(input size, output size)), simply by recognizing one specific input and returning the output previously computed for it. In the benchmarking business, this is called cheating.
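A toy sketch of this "cheating" (all names here are made up for illustration): recognizing one memorized input costs O(input size) for the equality check plus O(output size) to copy the stored answer, so the best case drops to the theoretical minimum while every other input still goes through the real algorithm.

```python
# Hypothetical example: "improving" the best case of a sort by
# special-casing one precomputed input.
PRECOMPUTED_INPUT = [5, 3, 8, 1]
PRECOMPUTED_OUTPUT = [1, 3, 5, 8]

def cheating_sort(data):
    # Best case: O(n) to compare against the memorized input,
    # plus O(n) to copy the stored answer.
    if data == PRECOMPUTED_INPUT:
        return list(PRECOMPUTED_OUTPUT)
    # Every other input falls through to the real algorithm.
    return sorted(data)
```

The asymptotic best case is now O(input + output), yet nothing meaningful about the algorithm has improved.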
The term “bound” here has the same meaning as in the rest of mathematics: an upper (lower) bound on a set is a value that is no less (no more) than any element of that set.
For example, when discussing comparison-based sorting algorithms, we can prove that no comparison sort has better asymptotic efficiency than O(n log n) in the worst case (and also in the average case). Thus, O(n log n) is a lower bound on the efficiency of all possible comparison-based sorting algorithms in the worst case (as well as in the average case). O(n) is another lower bound. O(n log n) is a better lower bound than O(n). And it just happens that O(n log n) is a tight lower bound, because sorting algorithms with exactly this complexity actually exist.
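The reasoning behind this lower bound is the decision-tree argument: a comparison sort must distinguish all n! possible orderings of the input, so its decision tree needs at least n! leaves, and therefore its depth (the worst-case number of comparisons) is at least log₂(n!), which is Θ(n log n) by Stirling's approximation. A small numeric sketch (the function name is mine, not a standard one):

```python
import math

def min_comparisons(n):
    # Any comparison sort needs at least ceil(log2(n!)) comparisons
    # in the worst case; log2(n!) = sum of log2(k) for k = 2..n.
    return math.ceil(sum(math.log2(k) for k in range(2, n + 1)))

for n in (8, 64, 1024):
    lower = min_comparisons(n)
    print(n, lower, round(lower / (n * math.log2(n)), 2))
    # the ratio to n*log2(n) slowly approaches 1
```

For n = 3 this gives 3, which is indeed the optimal number of comparisons for sorting three elements.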
There is no finite upper bound on the complexity of the sorting problem, since an arbitrarily bad sorting algorithm can always be constructed.
On the other hand, we can discuss a specific sorting algorithm and prove that it never exceeds a certain number of operations, which would be an upper bound on its complexity. For example, the quicksort algorithm has an upper bound of O(n^2). It also has an upper bound of O(n^3). It does not have an upper bound of O(n log n), because there are inputs on which it exceeds this number of operations. The O(n^2) bound is tight because it is reached for some inputs.
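This can be seen directly with a naive quicksort that always picks the first element as the pivot (a minimal sketch, not a production implementation). On an already-sorted input every partition is maximally unbalanced, so the comparison count reaches exactly n(n-1)/2, the quadratic upper bound:

```python
def quicksort(items, counter):
    # counter is a one-element list used to count comparisons.
    if len(items) <= 1:
        return list(items)
    pivot, rest = items[0], items[1:]
    counter[0] += len(rest)  # one comparison per remaining element
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

n = 200
worst = [0]
quicksort(list(range(n)), worst)  # sorted input: worst case for this pivot
print(worst[0])  # n*(n-1)/2 = 19900 comparisons
```

A randomly shuffled input of the same size typically needs far fewer comparisons, close to n log₂ n, which is why quicksort's average case is O(n log n) even though its worst case is O(n^2).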
It is theoretically possible to discuss the lower bound in the same sense as above, but this is almost never done (this would be tantamount to discussing the complexity of the best case, which usually does not interest us).
We can also discuss the complexity of a particular problem and place both upper and lower bounds on it: how efficient is the most efficient algorithm (in the worst or average case) that solves it? (We do not discuss the best case, because the answer is not interesting; see above.) For comparison-based sorting, we know that both the tight upper bound and the tight lower bound are O(n log n), because O(n log n) sorting algorithms actually exist, and it can be shown that no better algorithm exists. This is not a very interesting case, because we have found the most efficient algorithm possible. For the knapsack problem, the situation is more interesting. We only know that an upper bound of O(2^n) exists, because an algorithm with this complexity trivially exists (brute force). We suspect, but cannot prove, that this bound is tight. We also cannot give any good lower bound (we suspect that no algorithm solves it in polynomial time, but cannot prove it).
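The trivial O(2^n) brute force for the 0/1 knapsack problem simply enumerates every subset of the items. A minimal sketch (the weights, values, and function name are example data of my own, not from any standard library):

```python
from itertools import combinations

def knapsack_bruteforce(weights, values, capacity):
    # Try all 2^n subsets of item indices and keep the best
    # total value whose total weight fits in the capacity.
    n = len(weights)
    best = 0
    for r in range(n + 1):
        for subset in combinations(range(n), r):
            if sum(weights[i] for i in subset) <= capacity:
                best = max(best, sum(values[i] for i in subset))
    return best

print(knapsack_bruteforce([3, 4, 5], [30, 50, 60], 8))  # 90 (items of weight 3 and 5)
```

The loop visits all 2^n subsets, which gives the O(2^n) upper bound; whether any algorithm can do fundamentally better is exactly the open question described above.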