Big-O complexity of a piece of code
Problem Overview
I have a question in algorithm design about complexity. A piece of code is given and I should calculate its complexity. The pseudo-code is:
for(i=1; i<=n; i++){
    j = i;
    do{
        k = j;
        j = j / 2;
    }while(k is even);
}
I traced this algorithm for some numbers and got different results. For example, if n = 6, the inner loop executes as follows:
i = 1 -> executes 1 time
i = 2 -> executes 2 times
i = 3 -> executes 1 time
i = 4 -> executes 3 times
i = 5 -> executes 1 time
i = 6 -> executes 2 times
There doesn't seem to be a regular pattern; how should I calculate this?
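The trace above can be reproduced with a short script (a direct transcription of the pseudo-code; the helper name `inner_iterations` is chosen here for illustration):

```python
def inner_iterations(i):
    """Count how many times the do-while body runs for a given i."""
    j = i
    count = 0
    while True:          # do { ... } while (k is even)
        count += 1
        k = j
        j = j // 2       # integer division, as in the pseudo-code
        if k % 2 != 0:   # the loop continues only while k is even
            break
    return count

for i in range(1, 7):
    print(f"i = {i} -> executes {inner_iterations(i)} times")
```

Running it for n = 6 reproduces the counts 1, 2, 1, 3, 1, 2 listed above.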
Algorithm Solutions
Solution 1 - Algorithm
The upper bound given by the other answers is actually too high. This algorithm has an O(n) runtime, which is a tighter upper bound than O(n*log n).
Proof: Let's count how many total iterations the inner loop will perform.
The outer loop runs n times, and the inner loop runs at least once for each of those.
- For even i, the inner loop runs at least twice. This happens n/2 times.
- For i divisible by 4, the inner loop runs at least three times. This happens n/4 times.
- For i divisible by 8, the inner loop runs at least four times. This happens n/8 times.
- ...
So the total amount of times the inner loop runs is:
n + n/2 + n/4 + n/8 + n/16 + ... <= 2n
The total amount of inner loop iterations is between n and 2n, i.e. it's Θ(n).
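The bound can be checked empirically with a quick sketch (the helper name `inner_iterations` is chosen for illustration):

```python
def inner_iterations(i):
    """Iterations of the do-while body for a given i."""
    count, j = 0, i
    while True:
        count += 1
        k, j = j, j // 2
        if k % 2 != 0:
            break
    return count

# Total inner-loop iterations should lie between n and 2n for every n.
for n in (10, 100, 1000, 10_000):
    total = sum(inner_iterations(i) for i in range(1, n + 1))
    assert n <= total <= 2 * n
    print(f"n = {n}: {total} total inner iterations")
```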
Solution 2 - Algorithm
You always assume you get the worst scenario in each level.
Now, you iterate over an array with N elements, so we start with O(N) already.
Now let's say your i is always equal to X, and X is always even (remember, worst case every time). How many times do you need to divide X by 2 to get 1? (Reaching 1 is the only way for the division of an even number to stop.)
In other words, we need to solve the equation X/2^k = 1, which gives X = 2^k and k = log_2(X). This makes our algorithm take O(n log_2(X)) steps, which can easily be written as O(n log(n)).
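For the worst case sketched above, where X is a power of two, the number of halvings is exactly log_2(X); a quick check (the function name is chosen for illustration):

```python
import math

def divisions_to_one(x):
    """Count how many times x is halved (integer division) until it reaches 1."""
    steps = 0
    while x > 1:
        x //= 2
        steps += 1
    return steps

# For X = 2^k the answer is exactly k = log2(X), the bound used above.
for k in range(1, 10):
    x = 2 ** k
    assert divisions_to_one(x) == int(math.log2(x)) == k
```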
Solution 3 - Algorithm
For such a loop, we cannot count the inner and outer loops separately: the variables are coupled! We thus have to count all steps.
In fact, for each iteration of the outer loop (on i), we will have 1 + v_2(i) steps, where v_2 is the 2-adic valuation (see for example http://planetmath.org/padicvaluation), which corresponds to the power of 2 in the prime factorization of i.
So if we add the steps for all i, we get a total number of steps of:
n_steps = \sum_{i=1}^{n} (1 + v_2(i))
= n + v_2(n!) // since v_2(i) + v_2(j) = v_2(i*j)
= 2n - s_2(n) // from Legendre formula (see http://en.wikipedia.org/wiki/Legendre%27s_formula with `p = 2`)
We then see that the number of steps is exactly :
n_steps = 2n - s_2(n)
As s_2(n) is the sum of the digits of n in base 2, it is negligible compared to n: it is at most log_2(n), since each base-2 digit is 0 or 1 and there are at most log_2(n) digits.
So the complexity of your algorithm is equivalent to n:
n_steps = O(n)
which is not the O(n log(n)) stated in many other solutions, but a smaller quantity!
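The exact formula n_steps = 2n - s_2(n) can be verified by brute force (helper names are chosen for illustration):

```python
def inner_iterations(i):
    """Iterations of the do-while body for a given i, i.e. 1 + v_2(i)."""
    count, j = 0, i
    while True:
        count += 1
        k, j = j, j // 2
        if k % 2 != 0:
            break
    return count

def s2(n):
    """Sum of the binary digits of n (its popcount)."""
    return bin(n).count("1")

# Check n_steps = 2n - s_2(n) exactly, for every n up to 500.
for n in range(1, 501):
    total = sum(inner_iterations(i) for i in range(1, n + 1))
    assert total == 2 * n - s2(n)
```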
Solution 4 - Algorithm
Let's start with the worst case: if you keep dividing by 2 (integer division), you don't stop until you get to 1, basically making the number of steps dependent on the bit-width, something you find out using the base-2 logarithm. So the inner part is log n; the outer part is obviously n, so it is N log N in total.
Solution 5 - Algorithm
The do loop halves j until k becomes odd. k is initially a copy of j, which is a copy of i, so the do loop runs 1 + (the power of 2 dividing i) times:
- i=1 is odd, so it makes 1 pass through the do loop,
- i=2 divides by 2 once, so 1+1,
- i=4 divides by 2 twice, so 1+2, etc.
That makes at most 1 + log(i) do executions (logarithm with base 2). The for loop iterates i from 1 through n, so the upper bound is n times (1 + log n), which is O(n log n).