Big-O complexity is defined in terms of the number of operations a Turing machine performs, so it can describe any algorithm. The N + 1 selects problem describes an inefficient relational algorithm (query pattern) that issues one initial query plus one additional query per record, i.e. N + 1 queries in total. Since this query pattern is an algorithm, you can analyze its complexity:
O(N + 1) = O(N)
This means it has linear complexity: the number of queries grows with the number of records. If we used the correct algorithm instead, we would need only one query (selection) per table, two queries in total, regardless of the number of records. The complexity would be:
O(2) = O(1)
This algorithm has constant complexity. So by analyzing the complexity of both approaches, you can see which one suffers from the N + 1 selects problem.
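To make the two query counts concrete, here is a minimal sketch using Python's built-in sqlite3 module. The tables and data (authors/books) are hypothetical, purely for illustration: the first function issues N + 1 queries (one for the parent rows, then one per row), while the second fetches the same data with a single JOIN, so its query count stays constant as N grows.

```python
import sqlite3

# Hypothetical demo schema: two authors, three books.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE books (id INTEGER PRIMARY KEY, author_id INTEGER, title TEXT);
    INSERT INTO authors VALUES (1, 'Ann'), (2, 'Bob');
    INSERT INTO books VALUES (1, 1, 'A1'), (2, 1, 'A2'), (3, 2, 'B1');
""")

def titles_n_plus_one(conn):
    # 1 query for the authors, then 1 query per author: N + 1 queries total.
    result = []
    for author_id, name in conn.execute("SELECT id, name FROM authors"):
        for (title,) in conn.execute(
                "SELECT title FROM books WHERE author_id = ?", (author_id,)):
            result.append((name, title))
    return result

def titles_join(conn):
    # A single JOIN query: the query count is constant, no matter how
    # many authors there are.
    return list(conn.execute(
        "SELECT a.name, b.title FROM authors a "
        "JOIN books b ON b.author_id = a.id"))

# Both approaches return the same rows; only the number of queries differs.
assert sorted(titles_n_plus_one(conn)) == sorted(titles_join(conn))
```

Most ORMs offer eager loading (e.g. a join or a second batched query) precisely to collapse the N + 1 pattern into a constant number of queries.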
Is this clear?