Why is the Big-O complexity of this algorithm O(n^2)?

Algorithm, Time Complexity, Big O, Complexity Theory, Asymptotic Complexity

Algorithm Problem Overview


I know the big-O complexity of this algorithm is O(n^2), but I cannot understand why.

int sum = 0; 
int i = 1, j = n * n; 
while (i++ < j--) 
  sum++;

Even though we set j = n * n at the beginning, we increment i and decrement j during each iteration, so shouldn't the resulting number of iterations be a lot less than n*n?

Algorithm Solutions


Solution 1 - Algorithm

During every iteration you increment i and decrement j, which is equivalent to just incrementing i by 2. Therefore, the total number of iterations is about n^2 / 2, and that is still O(n^2).
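
As a quick, illustrative check (the sample value of n below is arbitrary), you can count the iterations of the original loop and compare them to n*n/2:

#include <stdio.h>

int main(void) {
    int n = 100;                 /* any sample size works here */
    int i = 1, j = n * n;
    long iterations = 0;
    while (i++ < j--)            /* the loop from the question, with a counter added */
        iterations++;
    /* both values printed here are 5000 */
    printf("iterations = %ld, n*n/2 = %d\n", iterations, n * n / 2);
    return 0;
}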

Solution 2 - Algorithm

Big-O complexity ignores constant coefficients. For example, O(n), O(2n), and O(1000n) are all the same O(n) running time. Likewise, O(n^2) and O(0.5n^2) are both O(n^2) running time.

In your situation, you're essentially incrementing your loop counter by 2 each time through your loop (since j-- has the same effect as i++). So your running time is O(0.5n^2), but that's the same as O(n^2) when you remove the coefficient.
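
To see that coefficient of roughly 0.5 directly, here is a small sketch (the loop bounds are chosen only for illustration) that prints the ratio of iterations to n^2 for a few values of n:

#include <stdio.h>

int main(void) {
    for (int n = 10; n <= 1000; n *= 10) {
        long i = 1, j = (long)n * n, iterations = 0;
        while (i++ < j--)
            iterations++;
        /* the ratio is ~0.5 for every n, which is exactly the coefficient big-O drops */
        printf("n = %4d  iterations / n^2 = %.3f\n", n, (double)iterations / ((double)n * n));
    }
    return 0;
}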

Solution 3 - Algorithm

You will have exactly n*n/2 loop iterations (or (n*n-1)/2 if n is odd). In big-O notation, O((n*n-1)/2) = O(n*n/2) = O(n*n), because constant factors "don't count".
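
For illustration, a brief check of that exact count; with integer division, n*n/2 rounds down and therefore covers both the even and the odd case:

#include <stdio.h>

int main(void) {
    for (int n = 2; n <= 7; n++) {
        int i = 1, j = n * n, iterations = 0;
        while (i++ < j--)
            iterations++;
        /* n*n/2 with integer division equals (n*n - 1)/2 whenever n*n is odd */
        printf("n = %d  iterations = %d  n*n/2 = %d\n", n, iterations, n * n / 2);
    }
    return 0;
}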

Solution 4 - Algorithm

Your algorithm is equivalent to

while ((i += 2) < n * n) 
  ...

which is O(n^2 / 2), and that is the same as O(n^2) because big-O complexity does not care about constant factors.
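
As a rough check (the parentheses around i += 2 are needed in C so the addition happens before the comparison), the two loops land within one iteration of each other, which makes no difference asymptotically:

#include <stdio.h>

int main(void) {
    int n = 50;

    int i = 1, j = n * n, original = 0;
    while (i++ < j--)                /* the loop from the question */
        original++;

    int k = 1, rewritten = 0;
    while ((k += 2) < n * n)         /* counter advances by 2 toward a fixed bound */
        rewritten++;

    /* prints: original = 1250, rewritten = 1249 */
    printf("original = %d, rewritten = %d\n", original, rewritten);
    return 0;
}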

Solution 5 - Algorithm

Let m be the number of iterations taken. After m iterations the counter has grown to i + m and the bound has shrunk to n^2 - m, so the loop stops roughly when

i + m = n^2 - m

which gives

m = (n^2 - i)/2

With i = 1 that is about n^2 / 2, so in Big-O notation this implies a complexity of O(n^2).
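
As a quick numeric sanity check (n = 10 is an arbitrary sample), the formula and a simulation of the loop agree up to rounding:

#include <stdio.h>

int main(void) {
    int n = 10;
    int i = 1, j = n * n, m = 0;
    while (i++ < j--)
        m++;
    /* the formula gives (100 - 1) / 2 = 49.5; the loop actually runs 50 times */
    printf("simulated m = %d, formula (n*n - 1) / 2 = %.1f\n", m, (n * n - 1) / 2.0);
    return 0;
}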

Solution 6 - Algorithm

Yes, this algorithm is O(n^2).

To classify the complexity, consider the usual table of complexity classes, ordered from slowest-growing to fastest-growing:

O(1), O(log n), O(n), O(n log n), O(n²), O(n^a), O(a^n), O(n!)

Each entry is a set of algorithms, and each set is contained in the ones to its right: an algorithm that is in O(1) is also in O(n), in O(n²), and so on, but not the reverse. Your algorithm executes about n*n/2 statements, and

n < n log n < n*n/2 ≤ n²

so the smallest class in the table that contains your algorithm is O(n²); O(n) and O(n log n) are too small.

For example, with n = 100 the loop runs sum = 5000 times: 100 (n) < 200 (n·log₁₀ n) < 5000 (n·n/2) < 10000 (n²).

I'm sorry for my English.

Solution 7 - Algorithm

> Even though we set j = n * n at the beginning, we increment i and decrement j during each iteration, so shouldn't the resulting number of iterations be a lot less than n*n?

Yes! That's why it's O(n^2). By the same logic, it's a lot less than n * n * n, which makes it O(n^3). It's even O(6^n), by similar logic.

big-O gives you information about upper bounds.

I believe you are really asking why the complexity is Theta(n^2) or Omega(n^2), but if you're just trying to understand what big-O is, you need to understand first and foremost that it gives upper bounds on functions.
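
A small sketch of that distinction, with the bound constants below chosen just for illustration: the iteration count is at most 1·n^2 (an O(n^2) upper bound) and at least n^2/4 (an Omega(n^2) lower bound), so it is Theta(n^2):

#include <stdio.h>

int main(void) {
    for (int n = 2; n <= 1000; n *= 2) {
        long i = 1, j = (long)n * n, count = 0;
        while (i++ < j--)
            count++;
        long upper = (long)n * n;        /* any constant >= 1/2 works for the O bound     */
        long lower = (long)n * n / 4;    /* any constant <= 1/2 works for the Omega bound */
        printf("n = %4d  count = %7ld  %s\n", n, count,
               (lower <= count && count <= upper) ? "within n^2/4 .. n^2" : "outside bounds");
    }
    return 0;
}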

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Omar N | View Question on Stackoverflow
Solution 1 - Algorithm | Miljen Mikic | View Answer on Stackoverflow
Solution 2 - Algorithm | Ben Rubin | View Answer on Stackoverflow
Solution 3 - Algorithm | m3tikn0b | View Answer on Stackoverflow
Solution 4 - Algorithm | Salvador Dali | View Answer on Stackoverflow
Solution 5 - Algorithm | Ujjwal Aryan | View Answer on Stackoverflow
Solution 6 - Algorithm | Francisco Manuel García Botella | View Answer on Stackoverflow
Solution 7 - Algorithm | djechlin | View Answer on Stackoverflow