Generic form: T(n) = aT(n/b) + f(n)
So I must compare n^(log_b a) with f(n):
if n^(log_b a) > f(n), it is case 1 and T(n) = Θ(n^(log_b a))
if n^(log_b a) < f(n), it is case 2 and T(n) = Θ(n^(log_b a) * log n)
Is that correct? Or have I misunderstood something?
And what about case 3? When does it apply?
I think you have misunderstood it.
if n^(log_b a) > f(n), it is case 1 and T(n) = Θ(n^(log_b a))
Here you should not be worried that f(n) does not appear in the result T(n) = Θ(n^(log_b a)). f(n) is part of T(n), and once you obtain the bound for T(n), that bound is already inclusive of f(n). So there is no need to consider that part separately.
Let me know if you are not clear.
Note: I understand that this is too late to answer this question. I am just putting it here for the future generations :)
Master Theorem for Solving Recurrences
Recurrences occur in a divide and conquer strategy of solving complex problems.
What Does It Solve?
- It solves recurrences of the form T(n) = aT(n/b) + f(n).
- a must be greater than or equal to 1, which means the problem is reduced to at least one smaller subproblem: at least one recursive call is needed.
- b must be greater than 1, which means every recursion reduces the problem to a smaller size. If b is not greater than 1, the subproblems are not smaller than the original problem.
- f(n) must be positive for sufficiently large values of n.
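As a concrete instance of this form, merge sort satisfies the recurrence T(n) = 2T(n/2) + Θ(n), i.e. a = 2, b = 2, f(n) = n. A minimal sketch:

```python
# Merge sort: T(n) = 2T(n/2) + Theta(n), so a = 2, b = 2, f(n) = n.
def merge_sort(xs):
    if len(xs) <= 1:              # base case, solved in constant time
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])   # one subproblem of size n/b
    right = merge_sort(xs[mid:])  # a = 2 subproblems in total
    # Merge step: the f(n) = Theta(n) work done outside the recursive calls.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

Here a = 2 because each call spawns two recursive calls, and b = 2 because each subproblem is half the size of its parent.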
Consider the recursion tree in the image below:
Let's say we have a problem of size n to be solved. At each step the problem can be divided into a subproblems, each of a smaller size, where the size is reduced by a factor of b. In simple words, a problem of size n can be divided into a subproblems of relatively smaller size n/b.
The diagram also shows that, once the problem has been divided enough times, each subproblem is so small that it can be solved in constant time.
For the derivation below, take all logarithms to the base b.
Let us say that H is the height of the tree; then H = log_b n.
Number of leaves = a^(log_b n) = n^(log_b a).
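The leaf-count identity a^(log_b n) = n^(log_b a) can be sanity-checked numerically; here is a small sketch (the values a = 4, b = 2, n = 256 are just an arbitrary example):

```python
import math

# Check numerically that a**log_b(n) equals n**log_b(a) for the leaf count.
a, b, n = 4, 2, 256
height = math.log(n, b)            # H = log_b(n) = 8
leaves = a ** height               # a**H = 4**8 = 65536
assert math.isclose(leaves, n ** math.log(a, b))  # same as n**log_b(a)
```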
- Total work done at Level 1: f(n)
- Total work done at Level 2: a * f(n/b)
- Total work done at Level 3: a * a * f(n/b^2)
- Total work done at the last level: number of leaves * Θ(1), which equals a^(log_b n) = n^(log_b a)
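The level-by-level costs above can be tabulated with a short sketch (the helper name tree_level_work is hypothetical):

```python
# Sum the recursion-tree levels of T(n) = a*T(n/b) + f(n) numerically.
# At level i there are a**i subproblems, each of size n / b**i.
def tree_level_work(n, a, b, f):
    levels = []
    i = 0
    size = float(n)
    while size >= 1:
        levels.append((a ** i) * f(size))  # (number of nodes) * (work per node)
        size /= b
        i += 1
    return levels

# T(n) = 2T(n/2) + n: every level does exactly n work,
# and there are log_b(n) + 1 levels.
work = tree_level_work(1024, a=2, b=2, f=lambda m: m)
```

For this recurrence every entry of `work` is 1024, which is the "roughly equal per level" situation described under case 2 below.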
Three cases of Master Theorem
Case 1:
Now let us assume that the cost per level increases by a significant factor as we go down the tree, which happens when f(n) is polynomially smaller than n^(log_b a). Then the overall running time is heavily dominated by the cost of the last level, i.e. by the leaves. Hence T(n) = Θ(n^(log_b a)).
Case 2:
Let us assume that the cost of the operation on each level is roughly equal, which happens when f(n) = Θ(n^(log_b a) * log^k n) for some k >= 0. The total running time is then the per-level cost times the number of levels:
T(n) = Θ(n^(log_b a) * log^(k+1) n), where log n is the height of the tree. In the common k = 0 case this is simply T(n) = Θ(n^(log_b a) * log n).
Case 3:
Let us assume that the cost per level decreases by a significant factor as we go down the tree, which happens when f(n) is polynomially larger than n^(log_b a) and f satisfies the regularity condition a * f(n/b) <= c * f(n) for some constant c < 1. Then the overall running time is heavily dominated by the cost of the first level, i.e. by the root. Hence T(n) = Θ(f(n)).
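The three cases can be summarized in a small sketch that classifies recurrences with a polynomial driving function f(n) = n^c (the helper name master_case is hypothetical, and case 3 additionally assumes the regularity condition holds):

```python
import math

# Classify T(n) = a*T(n/b) + n**c under the master theorem. This sketch
# only handles polynomial f(n) = n**c; extra log factors are out of scope.
def master_case(a, b, c):
    crit = math.log(a, b)                       # critical exponent log_b(a)
    if math.isclose(c, crit):
        return f"case 2: Theta(n^{crit:g} * log n)"
    if c < crit:
        return f"case 1: Theta(n^{crit:g})"
    return f"case 3: Theta(n^{c:g})"            # regularity condition assumed

print(master_case(8, 2, 1))   # T(n) = 8T(n/2) + n    -> case 1
print(master_case(2, 2, 1))   # T(n) = 2T(n/2) + n    -> case 2
print(master_case(2, 2, 2))   # T(n) = 2T(n/2) + n^2  -> case 3
```

The first example corresponds to a leaf-dominated tree, the second to equal work on every level (e.g. merge sort), and the third to a root-dominated tree.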
If you are interested in more detailed reading, and perhaps some examples to practice, you can visit my blog entry on the same topic: Master Method to Solve Recurrences.