Give a big-O estimate for the following program:

i = 1
while (i <= n) {
    i = 2 * i
}
By drawing a quick table comparing the value of i to the iteration count, we see that:
if n = 5 we need 3 iterations (i takes the values 1, 2, 4)
if n = 8 we need 4 iterations (i takes the values 1, 2, 4, 8), and so on...
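
To double-check those counts, here is a minimal sketch that instruments the loop with a counter; the helper name iterations and the test range are my own, added just to reproduce the table:

#include <stdio.h>

/* Count how many times the loop body runs for a given n. */
static int iterations(int n) {
    int count = 0;
    for (int i = 1; i <= n; i = 2 * i)
        count++;
    return count;
}

int main(void) {
    for (int n = 1; n <= 10; n++)
        printf("n = %2d -> %d iterations\n", n, iterations(n));
    return 0;
}

Running it prints, for example, "n =  5 -> 3 iterations" and "n =  8 -> 4 iterations".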
So I say:
how many iterations k do we need until 2^k > n? (After k iterations, the loop has set i = 2^k.)

2^k > n
log(2^k) > log(n)
k > log(n)

where log is base 2.
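
As a sanity check on that inequality, here is a small sketch (my own, not part of the original question) that finds the smallest k with 2^k > n and prints it next to log2(n); compile with -lm for the math library:

#include <stdio.h>
#include <math.h>

int main(void) {
    for (int n = 1; n <= 16; n++) {
        /* Find the smallest k such that 2^k > n. */
        int k = 0;
        while ((1 << k) <= n)
            k++;
        printf("n = %2d  smallest k with 2^k > n: %d  log2(n) = %.2f\n",
               n, k, log2((double)n));
    }
    return 0;
}

On this range, the k it prints matches the loop's iteration count exactly.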
But I am stuck now... how do I get the big-O from this?