How does IF affect complexity?

Let's say we have an array of 1,000,000 elements, and we go through all of them to check something simple, for example whether the first character is "A". As far as I (barely) understand it, the complexity will be O(n), and it will take some amount of time. If I add another IF (not an else-if) to check, for example, whether the last character is "G", how will that change the complexity? Will it double the complexity and the time, like O(2n) and 2x?

I'd like to leave aside how much computation the individual operations perform. For example, I understand that Len() requires more work than a single character comparison, but let's assume the operations used in the IFs have (almost) the same cost.

+4
3 answers

O(2n) = O(n). More generally, O(kn) = O(n) for any constant k. Of course, with two IFs the loop may take twice as long, but the running time is still a linear function of the input size.
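A minimal sketch of this point (the function and data are made up for illustration): adding a second constant-time check roughly doubles the work per element, but the total work still grows linearly with n.

```python
def count_matches(strings):
    """Count how many constant-time checks succeed across the array."""
    hits = 0
    for s in strings:              # the loop runs n times -> O(n)
        if s and s[0] == "A":      # first constant-time check
            hits += 1
        if s and s[-1] == "G":     # second constant-time check:
            hits += 1              # ~2x the work per element, still O(n)
    return hits

print(count_matches(["Alpha", "BIG", "AG"]))  # 4
```

Doubling the per-element work changes the constant hidden by the O(), not the growth rate.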

Edit: this explains big-O notation with examples, without being too mathematically oriented.

+11

Asymptotic complexity (which is what big-O notation measures) is independent of constant factors; more specifically, you can multiply or divide the function by any constant factor and it remains in the same class (i.e. O(2n) = O(n)).

Assuming an if statement takes a constant amount of time, it will only add a constant factor to the complexity.

A "constant amount of time" here means:

  • The time spent on the if statement for a given element does not depend on the number of other elements in the array
  • Basically, it must not call a function that somehow looks through the other elements of the array, or anything similar
  • Any if statement that makes no function calls is probably fine (unless it contains an operator that iterates over the array, which some languages allow)

So 2 (constant-time) if statements executed for each element give O(2n), which is O(n) (well, strictly speaking it may not be exactly 2n; more on that in the additional note).
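To illustrate the distinction the bullets above draw (function and variable names here are hypothetical): a condition that inspects only the current element stays constant-time, while a condition that scans the rest of the array does not, and running the latter once per element turns the loop into O(n^2).

```python
def constant_time_check(arr, i):
    # O(1) per element: inspects only the element itself.
    return arr[i][0] == "A"

def linear_time_check(arr, i):
    # NOT constant time: the condition scans the whole array,
    # so running it once per element makes the outer loop O(n^2).
    return any(s[0] == arr[i][0] for j, s in enumerate(arr) if j != i)

words = ["Apple", "Ant", "Bee"]
print(constant_time_check(words, 0))  # True ("Apple" starts with "A")
print(linear_time_check(words, 0))    # True ("Ant" also starts with "A")
```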

See Wikipedia for more details and a more formal definition.

Note: Besides being independent of constant factors, big-O is also independent of asymptotically smaller terms (terms that stay smaller no matter how large n gets), e.g. O(n) = O(n + sqrt(n)). And big-O is just an upper bound, so saying the complexity is O(n^9999) would also be correct (although stating that on a test/exam will probably get you 0 marks).

Additional note: The problem with not ignoring constant factors is: what counts as a unit of work? There is no standard definition. One way is to use the operation that takes the longest, but determining that may not always be straightforward, won't always be particularly accurate, and you won't generally be able to compare the complexities of different algorithms.

+4

This ties in with a question I posted today.

In your example it depends on whether you can jump straight from the first character to the last one; if you cannot, it also depends on the average length of each record.

If, as you go through the array, you have to read each complete record to evaluate your two if statements, then your order will be O(1,000,000 × N), where N is the average length of each record. Here N is a variable, so it affects the order. An example would be long multiplication, where we add Log(N) partial results, each Log(N) digits long, and therefore get order O(Log^2(N)), or if you prefer, O((Log(N))^2).
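A rough sketch of the long-multiplication example (the function name and the digit-counting shortcut are mine, for illustration): schoolbook multiplication adds about d partial products, each about d digits long, where d is the number of digits, so the digit-level work grows like d^2.

```python
def long_mult_digit_ops(n):
    # Schoolbook multiplication of two numbers the size of n adds
    # about d partial products, each about d digits long, where
    # d = number of digits -- so roughly d * d digit operations.
    d = len(str(n))
    return d * d

# Doubling the digit count quadruples the work:
print(long_mult_digit_ops(99999))       # d = 5  -> 25
print(long_mult_digit_ops(9999999999))  # d = 10 -> 100
```

Since d is about Log(N), the work is about Log^2(N): the "length of each record" factor is a variable here, not a constant, so it cannot be dropped from the order.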

On the other hand, if you can check just the first and the last character directly, then N = 2, which is a constant and can be ignored.
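The contrast can be sketched like this (function names are hypothetical): direct indexing inspects exactly two characters regardless of record length, whereas walking the record to reach its last character costs O(N) per record.

```python
def check_record_direct(record):
    # Indexing looks at exactly two characters no matter how
    # long the record is -> constant time per record.
    return record[0] == "A" and record[-1] == "G"

def check_record_scan(record):
    # Walking the record to find its last character costs
    # O(N) per record, where N is the record's length.
    last = None
    for ch in record:
        last = ch
    return record[0] == "A" and last == "G"

print(check_record_direct("ABCDEFG"))  # True
print(check_record_scan("ABCDEFG"))    # True
```

Both give the same answer, but only the first keeps the per-record cost constant, which is what lets you drop N from the order.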

This is an important point to be careful about: how do you decide whether a factor can be ignored? For example, suppose we were adding Log(N) numbers, each Log(N/100) digits long. Just because Log(N/100) is the smaller term does not mean we can ignore it. A multiplicative factor cannot be ignored when it is a variable.

-2

Source: https://habr.com/ru/post/1484399/

