For an algorithm to have constant time complexity, its execution time must stay the same as the input size n grows. If your function takes noticeably different amounts of time at n = 1 and at n = 1000000, it is not O(1), i.e. it does not have constant time complexity.
Let me calculate how many steps the first function takes to complete:
n / 2^x = 1 → x = log₂(n)
The second function, however, would in theory keep dividing n by 2 forever, but in practice it stops after roughly log(n) + c steps. The constant c is dropped, so the complexity is again log(n).
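Neither function is quoted in the answer, but a minimal sketch of the halving pattern (the helper `halving_steps` below is hypothetical, not from the original code) makes the log₂(n) step count concrete:

```python
import math

def halving_steps(n):
    """Count how many times n can be halved (integer division) before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

# The number of halvings grows like log2(n), not like n itself:
print(halving_steps(1024))                 # 10
print(halving_steps(1000000))              # 19
print(math.floor(math.log2(1000000)))      # 19 — matches the loop count
```

Doubling n adds only one more step, which is exactly the logarithmic growth described above.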