I am trying to figure out the time complexity, in big-O terms, of this snippet:
for (int i = 1; i <= n; i++) {
    for (int j = 1; j <= i * i; j++) {
        if (j % i == 0) {
            for (int k = 0; k < j; k++)
                sum++;
        }
    }
}
Starting with the innermost loop: in the worst case it runs k = n^2 times, which contributes O(n^2). The if condition is true every time j = m * i, where m is a positive integer. Since j runs from 1 to i^2, this happens for m = {1, 2, ..., i}, meaning the condition is true i times; i can be at most n, so in the worst case it is true m = {1, 2, ..., n} = n times. The middle loop is at worst O(n^2), when i = n. The outer loop has O(n) complexity.
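For what it's worth, the divisibility observation also yields an exact count of the innermost increments (a side calculation, not part of the original question): each qualifying j = m * i triggers j = m * i increments of `sum`, so the total is

$$\sum_{i=1}^{n}\sum_{m=1}^{i} m\,i \;=\; \sum_{i=1}^{n} i \cdot \frac{i(i+1)}{2} \;=\; \Theta(n^4),$$

which is one way to see how summing the actual work can differ from multiplying per-loop worst cases.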
I reasoned that these combine as follows: O(n^2) for the inner loop × O(n^2) for the middle loop × O(n) for the outer loop gives a worst-case time complexity of O(n^5).
However, the if statement ensures that we only reach the inner loop n times, not n^2 times. Regardless, we still have to step through the two outer loops n * n^2 times. Does the if test affect the worst-case time complexity of this fragment?
Edit: fixed the bound of j to be i^2, not i.