Time complexity of nested for-loop


Question:

I need to calculate the time complexity of the following code:

for (i = 1; i <= n; i++)
{
  for(j = 1; j <= i; j++)
  {
   // Some code
  }
}

Is it O(n^2)?

Answer 1:

Yes, nested loops are one quick way to end up with a large Big-O bound.

Typically (but not always) one loop nested in another will cause O(n²).

Think about it: the inner loop is executed i times for each value of i, and the outer loop is executed n times.

Thus you see a pattern of execution like this: 1 + 2 + 3 + 4 + ... + n times

Therefore, we can bound the number of code executions by saying it obviously executes more than n times (a lower bound), but in terms of n, how many times are we executing the code?

Well, mathematically we can say that it will execute no more than n² times, giving us a worst-case scenario and therefore our Big-Oh bound of O(n²). (For more information on how we can say this mathematically, look at the power series.)

Big-Oh doesn't always measure exactly how much work is being done, but it usually gives a reliable approximation of the worst-case scenario.
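As a quick sanity check (a minimal sketch of my own; the counter variable is just for illustration), you can count the inner-loop iterations directly and compare the total against the bounds discussed above:

#include <stdio.h>

int main(void)
{
    int n = 8;               /* any positive n will do */
    long count = 0;

    for (int i = 1; i <= n; i++)
    {
        for (int j = 1; j <= i; j++)
        {
            count++;         /* stands in for "some code" */
        }
    }

    /* count equals 1 + 2 + ... + n = n(n + 1)/2, which never exceeds n^2 */
    printf("count = %ld, n(n+1)/2 = %d, n^2 = %d\n",
           count, n * (n + 1) / 2, n * n);
    return 0;
}

For n = 8 this prints count = 36, which matches n(n+1)/2 = 36 and stays below n^2 = 64.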


4 yrs later Edit: Because this post seems to get a fair amount of traffic, I want to explain more fully how we bound the execution count by O(n²) using the power series.

From the website: 1 + 2 + 3 + 4 + ... + n = (n² + n)/2 = n²/2 + n/2. How, then, are we turning this into O(n²)? What we're (basically) saying is that n² >= n²/2 + n/2. Is this true? Let's do some simple algebra.

  • Multiply both sides by 2 to get: 2n² >= n² + n?
  • Expand 2n² to get: n² + n² >= n² + n?
  • Subtract n² from both sides to get: n² >= n?

It should be clear that n² >= n (not strictly greater than, because of the case where n=0 or 1), assuming that n is always an integer.

Actual Big O complexity is slightly different from what I just said, but this is the gist of it. In actuality, Big O complexity asks whether there is a constant we can apply to one function such that it is larger than the other, for sufficiently large input (see the Wikipedia page).
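For reference, the standard definition being alluded to can be written out like this (the usual textbook definition, nothing specific to this post):

f(n) \in O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 \ge 0 : f(n) \le c \cdot g(n) \ \text{for all } n \ge n_0

Taking f(n) = n²/2 + n/2, g(n) = n², c = 1, and n_0 = 1 satisfies this, which is exactly the algebra worked through above.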



Answer 2:

A quick way to explain this is to visualize it.

If both i and j run from 0 to N, it's easy to see O(N^2):

O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O
O O O O O O O O

In this case, it's:

O
O O
O O O
O O O O
O O O O O
O O O O O O
O O O O O O O
O O O O O O O O

This comes out to roughly 1/2 of N^2, which is still O(N^2).
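If you want to reproduce that picture, a tiny sketch like the following (my own illustration, printing one O per inner-loop iteration) generates it:

#include <stdio.h>

int main(void)
{
    int n = 8;                   /* matches the 8-row picture above */

    for (int i = 1; i <= n; i++)
    {
        for (int j = 1; j <= i; j++)
        {
            printf("O ");        /* one mark per inner-loop iteration */
        }
        printf("\n");            /* new row for each outer iteration */
    }
    return 0;
}

Row i contains i marks, so the total number of marks is the area of the triangle, about N^2/2.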



Answer 3:

Indeed, it is O(n^2). See also a very similar example with the same runtime here.



Answer 4:

First we'll consider loops where the number of iterations of the inner loop is independent of the value of the outer loop's index. For example:

 for (i = 0; i < N; i++) {
     for (j = 0; j < M; j++) {
         sequence of statements
     }
 }

The outer loop executes N times. Every time the outer loop executes, the inner loop executes M times. As a result, the statements in the inner loop execute a total of N * M times. Thus, the total complexity for the two loops is O(N * M).
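To make the N * M count concrete, here is a small counting sketch (the names and sizes are mine); note that when M happens to equal N, it collapses to the O(N^2) case from the question:

#include <stdio.h>

int main(void)
{
    int N = 5, M = 7;        /* arbitrary, independent loop bounds */
    long count = 0;

    for (int i = 0; i < N; i++)
    {
        for (int j = 0; j < M; j++)
        {
            count++;         /* stands in for the sequence of statements */
        }
    }

    printf("count = %ld, N * M = %d\n", count, N * M);   /* both print 35 */
    return 0;
}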



Answer 5:

On the 1st iteration of the outer loop (i = 1), the inner loop will iterate 1 time.
On the 2nd iteration of the outer loop (i = 2), the inner loop will iterate 2 times.
On the 3rd iteration of the outer loop (i = 3), the inner loop will iterate 3 times.
...
On the final iteration of the outer loop (i = n), the inner loop will iterate n times.

So, the total number of times the statements in the inner loop are executed is equal to the sum of the integers from 1 to n, which is:

(n * (n + 1)) / 2 = (n^2 + n)/2 = O(n^2) times
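For completeness, the closed form used here is just the standard arithmetic-series identity:

\sum_{i=1}^{n} i = \frac{n(n+1)}{2} = \frac{n^2}{2} + \frac{n}{2} \in O(n^2)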