I came across these two code snippets in the CTCI book (Cracking the Coding Interview).
Code Snippet 1:

int min = Integer.MAX_VALUE;
int max = Integer.MIN_VALUE;
for (int x : array) {
    if (x < min) min = x;
    if (x > max) max = x;
}
Code Snippet 2:

int min = Integer.MAX_VALUE;
int max = Integer.MIN_VALUE;
for (int x : array) {
    if (x < min) min = x;
}
for (int x : array) {
    if (x > max) max = x;
}
The book does not give a clear answer to which one is faster and more efficient at the assembly level and after compiler optimization. I believe both of them have O(n) runtimes. The first uses a single loop with two conditional checks per iteration, while the second uses two loops with one conditional check each.
To be technically precise, the second would be O(2N) and the first O(N), but since we drop constant factors, both are described as O(N). So, for a huge N, will the constant really matter? Also, which one will a compiler turn into more optimized assembly?
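As a rough sanity check of the constant-factor question, one could time the two approaches with something like the following naive System.nanoTime micro-benchmark (a sketch only; the array size, seed, and warm-up count are arbitrary choices, and a proper harness such as JMH would give far more trustworthy numbers, since JIT compilation, dead-code elimination, and caching can all skew a hand-rolled timing loop):

```java
import java.util.Random;

public class MinMaxBenchmark {
    // Snippet 1: single pass, two comparisons per element
    static int[] onePass(int[] array) {
        int min = Integer.MAX_VALUE, max = Integer.MIN_VALUE;
        for (int x : array) {
            if (x < min) min = x;
            if (x > max) max = x;
        }
        return new int[] { min, max };
    }

    // Snippet 2: two passes over the array, one comparison per element each
    static int[] twoPass(int[] array) {
        int min = Integer.MAX_VALUE, max = Integer.MIN_VALUE;
        for (int x : array) {
            if (x < min) min = x;
        }
        for (int x : array) {
            if (x > max) max = x;
        }
        return new int[] { min, max };
    }

    public static void main(String[] args) {
        // Arbitrary size and seed, chosen just for illustration
        int[] data = new Random(42).ints(10_000_000).toArray();

        // Crude JIT warm-up before timing
        for (int i = 0; i < 5; i++) {
            onePass(data);
            twoPass(data);
        }

        long t0 = System.nanoTime();
        int[] r1 = onePass(data);
        long t1 = System.nanoTime();
        int[] r2 = twoPass(data);
        long t2 = System.nanoTime();

        System.out.println("one pass: " + (t1 - t0) / 1_000_000 + " ms, min=" + r1[0] + ", max=" + r1[1]);
        System.out.println("two pass: " + (t2 - t1) / 1_000_000 + " ms, min=" + r2[0] + ", max=" + r2[1]);
    }
}
```

Note that both versions perform the same 2N comparisons in total; the second differs mainly in doing 2N loop iterations and reading the array from memory twice, which is where any measurable difference would likely come from.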
EDIT: For a huge N, will the constant factor of 2 actually matter, or will both effectively scale the same as N grows?