I am facing a recurrence: T(N) = T(sqrt(N)) + 5.
I am wondering whether I can solve it this way:
T(N) = O(sqrt(N)) + O(5)
Since O(5) = O(1) is a constant, we can ignore it.
So the big O notation of T(N) is O(N^(1/2)).
Or can I just say it is O(N), since there is not much difference between O(N) and O(sqrt(N))?
Thank you!
Edit: I made a mistake in the original answer, assuming that n is a power of 2 and reducing the recurrence to 1, 2, 4, ..., n, which is wrong. I apologize for the confusion. Here is the updated answer.
From,
T(n) = T(sqrt(n)) + 5,
we can also write it as:
T(n) = T(n^(1/2)) + 5,
then by recurrence:
T(n^(1/2)) = T(n^(1/4)) + 5,
T(n^(1/4)) = T(n^(1/8)) + 5,
...
T(n^(2^-m)) = T(n^(2^-(m+1))) + 5,
so the exponent never reaches a constant base case where we can stop. Therefore we need to substitute a convenient form for n.
Try:
n = 2^(2^m),
where we have
m = log log n (logarithms base 2),
starting from m = 0, which gives n = 2,
then we have:
T(n) = T(2) + 5 + 5 + ... + 5,
how many 5s are there?
We count the steps along the chain:
2^(2^0), 2^(2^1), 2^(2^2), ..., 2^(2^m),
which has m+1 terms and therefore m steps, so there are m 5s, where m = log log n. So
T(n) = T(2) + 5 log log n,
which is,
T(n) = O(log log n).
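The count above is easy to sanity-check numerically. Here is a minimal Python sketch (the function name and the stopping value n = 2 are my choices for illustration, not from the recurrence itself):

```python
import math

def count_steps(n):
    """Count the constant-cost levels of T(n) = T(sqrt(n)) + 5,
    stopping at the base case n = 2."""
    steps = 0
    while n > 2:
        n = math.sqrt(n)
        steps += 1
    return steps

# For n = 2^(2^m) the count is exactly m = log log n:
for m in range(1, 6):
    n = 2.0 ** (2 ** m)
    print(m, count_steps(n))  # prints equal pairs: 1 1, 2 2, ...
```

These inputs are exact powers of two, so the floating-point square roots are exact and the loop hits n = 2 precisely.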
(For neatness, let's replace the 5 with a constant c.)

Substituting this function into itself multiple times, we can spot a pattern emerging:

T(n) = T(n^(1/2^k)) + c*k

When do we stop iterating? When the stopping condition is met. Take this to be n^(1/2^k) = 2 (not 1, as is usually the case, since the argument only approaches 1 asymptotically):

(1/2^k) * log n = 1, which gives k = log log n.

So the final cost of this function is:

T(n) = T(2) + c * log log n = O(log log n).

Note that the constant c (= 5) does not matter for the asymptotic complexity. (And also note that the result is not simply log n, but log log n.)
EDIT: if you were to choose a different stopping condition n^(1/2^k) = a, with a > 1, then the above step would become:

(1/2^k) * log n = log a, which gives k = log log n - log log a,

which only differs by a constant from the original result.
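You can also see numerically that the choice of stopping constant only shifts the count by an additive amount. A small sketch with a parameterised base case (again just an illustrative helper, assuming a > 1):

```python
import math

def count_steps(n, a):
    """Iterations of n -> sqrt(n) until n <= a (assumes a > 1)."""
    k = 0
    while n > a:
        n = math.sqrt(n)
        k += 1
    return k

# Changing the stopping constant from 2 to 16 shifts the count by
# the same additive constant (here 2 = log log 16), whatever n is:
for n in (2.0 ** 16, 2.0 ** 64, 2.0 ** 256):
    print(count_steps(n, 2) - count_steps(n, 16))  # prints 2 each time
```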