Asymptotic complexity (which is what big-O notation describes) is independent of constant factors: you can multiply the function by any constant and it remains equivalent (i.e. O(2n) = O(n)).
Assuming the if statement takes a constant amount of time, it will add a constant factor to the complexity.
A "constant amount of time" means:
- The time spent on the if statement for a given element does not depend on how many other elements are in the array
- In practice: it does not call a function that somehow looks through the other elements of the array, or anything similar
- Any if statement that makes no function calls is probably fine (unless its condition contains an operator that iterates over the array, which some languages allow)
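To make the distinction concrete, here is a minimal Python sketch (the function names are just illustrative): the first loop's condition looks only at the current element, so each check is constant time; the second loop's condition scans the whole array on every iteration, so it is not constant time.

```python
def count_positives(arr):
    count = 0
    for x in arr:
        if x > 0:              # constant time: looks only at x
            count += 1
    return count               # n constant-time checks -> O(n) overall

def count_duplicated(arr):
    count = 0
    for x in arr:
        if arr.count(x) > 1:   # NOT constant time: scans the whole array
            count += 1
    return count               # n checks, each O(n) -> O(n^2) overall
```

Both loops "contain one if statement per element", but only the first one adds a mere constant factor.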
Thus, the 2 (constant-time) if statements executed for each element give O(2n), which is O(n) (well, it may not be exactly 2n; more on that in the second note below).
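As a quick illustration of that point (a hypothetical example, assuming each condition is constant time): two if statements per element means roughly 2n checks in total, but the complexity is still O(n).

```python
def classify(arr):
    evens = odds = 0
    for x in arr:
        if x % 2 == 0:   # first constant-time check
            evens += 1
        if x % 2 == 1:   # second constant-time check
            odds += 1
    return evens, odds   # ~2n checks total, still O(n)
```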
See Wikipedia for more details and a more formal definition.
Note: besides being independent of constant factors, big-O is also independent of asymptotically smaller terms (terms that stay smaller no matter how large n grows), e.g. O(n) = O(n + sqrt(n)). Also, big-O is only an upper bound, so saying an O(n) algorithm is O(n^9999) would also be correct (though on a test/exam you would probably get 0 points for that).
Note: one problem with not ignoring constant factors is deciding what counts as a "unit of work". There is no standard definition. One approach is to count the operation that takes the longest, but defining that is not always straightforward, the result will not always be particularly accurate, and you cannot, in general, compare the complexities of different algorithms that way.