a)  i ← 1
    while i < n/4 do
        j ← 2i
        while j < n do
            j ← j + 1
        i ← i + 1

b)  i ← n
    while i > 1 do
        j ← i
        while j < n do
            j ← 2j
        i ← i − 1

c)  i ← 1
    while i < n do
        j ← 0
        while j ≤ i do
            j ← j + 1
        i ← 2i
Given these three programs, what would be the easiest approach to finding the time complexity of each one? I can tell that the first one is probably going to be O(n^2), but is there an approach that works consistently across problems like these? I have an exam on this tomorrow.
(I'll ignore off-by-one errors, rounding, etc. in the summation limits.)
A)
In the inner loop you do n − 2i − 1 operations. In the outer loop you do n/4 − 1 iterations. So the complexity is:
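(summing the inner-loop cost over the outer loop, with the limits fudged per the caveat above)

    \sum_{i=1}^{n/4} (n - 2i - 1) \approx \frac{n^2}{4} - \frac{n^2}{16} = \frac{3n^2}{16} = O(n^2)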
So your answer to the first one is correct.
B)
In the inner loop, j gets doubled every time, so it grows exponentially; therefore the inner loop takes logarithmic time, log(n/i) (base 2, but WLOG we can ignore the base). The division by i is because j starts from i, so it is equivalent to starting from 1 and growing up to n/i. In the outer loop you do n − 1 iterations. So the complexity is:
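(summing the logarithmic inner-loop cost over the outer loop, limits fudged as before; the last step is Stirling's approximation, log(n!) ≈ n log n − n log e)

    \sum_{i=2}^{n} \log\frac{n}{i} = (n-1)\log n - \log(n!) \approx n\log e - \log n = O(n)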
C)
In the inner loop you do i + 1 ops. In the outer loop, i grows exponentially from 1 to n (taking the values 1, 2, 4, …, i.e. i = 2^k), so there are log n iterations. So the complexity is:
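(summing the inner-loop cost over the log n doublings, i.e. over i = 2^k for k = 0 … log n)

    \sum_{k=0}^{\log n} (2^k + 1) = (2^{\log n + 1} - 1) + (\log n + 1) \approx 2n + \log n = O(n)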
N.B. I notice a significant discrepancy with TypeKazt's answers; this illustrates that you can't simply "multiply" the inner loop's complexity by that of the outer loop, and that the boundary conditions of the inner loop are decisive.
EDIT: As added proof of my bold accusations against TypeKazt (hehe, sorry buddy), here is some test data I got from a C# implementation:

[charts of the measured operation counts against n]

As you can see, the complexity of A (linear scale) is O(n^2), whereas those of B and C (log-log scales) are linear.
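Code (a minimal sketch of the counting harness; the method names and the n values are illustrative, and the program I actually ran may have differed in its details):

    using System;

    class LoopComplexity
    {
        // a) outer loop runs i = 1 .. n/4, inner loop counts j up from 2i to n
        static long CountA(long n)
        {
            long ops = 0;
            for (long i = 1; i < n / 4; i++)
                for (long j = 2 * i; j < n; j++)
                    ops++;          // one unit of inner-loop work
            return ops;
        }

        // b) outer loop runs i = n .. 2, inner loop doubles j from i until it reaches n
        static long CountB(long n)
        {
            long ops = 0;
            for (long i = n; i > 1; i--)
                for (long j = i; j < n; j *= 2)
                    ops++;
            return ops;
        }

        // c) outer loop doubles i from 1 to n, inner loop counts j = 0 .. i
        static long CountC(long n)
        {
            long ops = 0;
            for (long i = 1; i < n; i *= 2)
                for (long j = 0; j <= i; j++)
                    ops++;
            return ops;
        }

        static void Main()
        {
            // Predicted orders: A ~ 3n^2/16 (quadratic); B ~ n*log2(e) and C ~ 2n (both linear)
            foreach (long n in new long[] { 1000, 4000, 16000, 64000 })
                Console.WriteLine($"n={n,6}  A={CountA(n),12}  B={CountB(n),8}  C={CountC(n),8}");
        }
    }

Each 4x step in n should multiply A's count by roughly 16, while B's and C's counts grow by roughly 4, matching the sums above.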
P.S. I know SO's policy is not to babysit people, but I for one learn well from examples, so I hope this helps you understand a simple analytical approach to complexity analysis.