How to calculate binary search complexity

Tags: Algorithm, Search, Time Complexity, Binary Search

Algorithm Problem Overview


I heard somebody say that since binary search halves the input required to search, it is a log(n) algorithm. Since I am not from a mathematics background, I am not able to relate to it. Can somebody explain it in a little more detail? Does it have something to do with the logarithmic series?

Algorithm Solutions


Solution 1 - Algorithm

Here is a more mathematical way of seeing it, though it is not really complicated. In my opinion it is much clearer than the informal explanations:

The question is: how many times can you divide N by 2 until you have 1? This is essentially what binary search does (halve the candidates) until you find the element. As a formula:

> 1 = N / 2^x

Multiply by 2^x:

> 2^x = N

Now take log2 of both sides:

> log2(2^x) = log2(N)
> x * log2(2) = log2(N)
> x * 1 = log2(N)
> x = log2(N)

This means you can divide N by 2 log2(N) times until you are down to 1. Which means you have to perform log2(N) divisions ("binary search steps") until you find your element.
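
Not part of the original answer, but a tiny Python sketch that literally performs this counting may help; it halves N until one candidate remains and compares the count with log2(N) (N = 1024 is an arbitrary example):

import math
# Count how many times N can be halved before a single candidate is left.
N = 1024                   # arbitrary example size (my choice)
x = 0
while N > 1:
    N //= 2                # one binary-search step discards half the candidates
    x += 1
print(x)                   # prints 10
print(math.log2(1024))     # prints 10.0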

Solution 2 - Algorithm

For Binary Search, T(N) = T(N/2) + O(1) // the recurrence relation

Apply the master theorem for computing the running time of recurrence relations of the form T(N) = aT(N/b) + f(N).

Here, a = 1, b = 2 => c = log_b(a) = log_2(1) = 0

Also, here f(N) = N^c log^k(N) // with k = 0 and c = log_b(a) = 0, i.e. f(N) = Θ(1)

So, T(N) = O(N^c log^(k+1)(N)) = O(N^0 log(N)) = O(log(N))
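
For readers who prefer standard notation, the same instantiation can be typeset as a LaTeX (amsmath) snippet; this only restates the steps above:

% Master theorem (extended case 2) applied to the binary-search recurrence
% T(N) = T(N/2) + \Theta(1), i.e. a = 1, b = 2, f(N) = \Theta(1):
\[
  c = \log_b a = \log_2 1 = 0,
  \qquad
  f(N) = \Theta\!\left(N^{c} \log^{k} N\right) \text{ with } k = 0 .
\]
\[
  \Rightarrow\; T(N) = \Theta\!\left(N^{c} \log^{k+1} N\right)
               = \Theta\!\left(N^{0} \log N\right)
               = \Theta(\log N) .
\]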

Source : http://en.wikipedia.org/wiki/Master_theorem

Solution 3 - Algorithm

T(n) = T(n/2) + 1

T(n/2) = T(n/4) + 1

Substituting the value of T(n/2) into the first equation gives T(n) = T(n/4) + 1 + 1. Continuing this way for k substitutions: T(n) = T(n/2^k) + 1 + 1 + ..... + 1 (k ones)

= T(2^k / 2^k) + k (choosing k so that 2^k = n)

= T(1) + k

Since we took 2^k = n,

k = log2(n)

So the time complexity is O(log n)
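
As a small sanity check (my own sketch, not part of the answer), the recurrence can be evaluated directly in Python, assuming the base case T(1) = 1, and compared with log2(n) + 1:

import math
# Evaluate T(n) = T(n/2) + 1 with an assumed base case T(1) = 1.
def T(n):
    if n <= 1:
        return 1           # assumed base case
    return T(n // 2) + 1
for n in (2, 16, 1024):
    print(n, T(n), math.log2(n) + 1)   # e.g. 16 -> 5 and 5.0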

Solution 4 - Algorithm

It doesn't halve the search time; that alone wouldn't make it log(n). It decreases the search time logarithmically. Think about this for a moment. If you had 128 entries in a table and had to search linearly for your value, it would probably take around 64 entries on average to find your value. That's n/2 or linear time. With a binary search, you eliminate half of the possible entries each iteration, so it takes only about 7 comparisons to find your value (log base 2 of 128 is 7, i.e. 2 to the 7th power is 128). This is the power of binary search.
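
A back-of-the-envelope check of those two numbers (my addition, using the formulas rather than an actual search):

import math
n = 128
print(n / 2)          # 64.0: linear search examines about half the entries on average
print(math.log2(n))   # 7.0: binary search needs about log2(n) = 7 halvings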

Solution 5 - Algorithm

The time complexity of the binary search algorithm belongs to the O(log n) class. This is called big O notation. The way you should interpret this is that the asymptotic growth of the time the function takes to execute, given an input set of size n, will not exceed log n (up to a constant factor).

This is just formal mathematical lingo that makes it possible to prove statements, etc. It has a very straightforward explanation: when n grows very large, the running time grows no faster than log n (times some constant). The size of the "input set", n, is just the length of the list.

Simply put, the reason binary search is in O(log n) is that it halves the input set in each iteration. It's easier to think about it in the reverse situation: in x iterations, how long a list can the binary search algorithm examine at most? The answer is 2^x. From this we can see that, conversely, the binary search algorithm needs about log2(n) iterations for a list of length n.

As for why it is O(log n) and not O(log2 n): simply put again, with big O notation constant factors don't count, and log2(n) differs from log(n) only by a constant factor.

Solution 6 - Algorithm

Let's say binary search terminates after k iterations. At each iteration, the array is divided in half. Let the initial length of the array be n.

At Iteration 1,

Length of array = n

At Iteration 2,

Length of array = n⁄2

At Iteration 3,

Length of array = (n/2)/2 = n/2^2

Therefore, after Iteration k,

Length of array = n/2^k

Also, we know that after k divisions the length of the array becomes 1. Therefore:

Length of array = n/2^k = 1
=> n = 2^k

Applying the log function on both sides:

=> log2(n) = log2(2^k)
=> log2(n) = k * log2(2)

As log_a(a) = 1,

Therefore,

k = log2(n)

Hence the time complexity of Binary Search is

log2 (n)

Solution 7 - Algorithm

⌊log₂(n) + 1⌋ is the maximum number of comparisons that are required to find something in a binary search. The average case makes approximately log₂(n) - 1 comparisons. Here's more info:

http://en.wikipedia.org/wiki/Binary_search#Performance
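
As an empirical illustration (my own sketch, not part of the answer), the snippet below counts one three-way comparison per probe of the middle element, then reports the worst case and the average over all 128 elements of a sorted list; the results can be compared against ⌊log₂(n)⌋ + 1 and roughly log₂(n) - 1:

import math
# One probe of the middle element is counted as one (three-way) comparison.
def probes(arr, target):
    lo, hi, count = 0, len(arr) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        count += 1                         # probe the middle element
        if arr[mid] == target:
            return count
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return count                           # target not present
n = 128
arr = list(range(n))
counts = [probes(arr, x) for x in arr]
print(max(counts), math.floor(math.log2(n)) + 1)   # worst case vs floor(log2(n)) + 1
print(sum(counts) / n, math.log2(n) - 1)           # average vs roughly log2(n) - 1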

Solution 8 - Algorithm

Here is the Wikipedia entry on binary search.

If you look at the simple iterative approach, you are just eliminating half of the remaining elements at each step until you find the element you need.

Here is the explanation of how we come up with the formula.

Say you initially have N elements; as a first attempt you probe the middle index ⌊(L + H)/2⌋, where L is the index of the first element (0) and H is the index of the last element of the list you are searching. If you are lucky, the element you are trying to find will be in the middle. For example, say you are searching for 18 in the list {16, 17, 18, 19, 20}: you calculate ⌊(0 + 4)/2⌋ = 2, where 0 is the lower bound (L, the index of the first element of the array) and 4 is the upper bound (H, the index of the last element of the array). So in this case L = 0 and H = 4, and 2 is the index of the element 18 that you are searching for. Bingo! You found it.

If the array were instead {15, 16, 17, 18, 19} and you were still searching for 18, you would not be lucky: the first probe ⌊(0 + 4)/2⌋ = 2 lands on the element 17 at index 2, which is not the number you are looking for. But now you know that you do not have to look at at least half of the array in your next attempt; your search effort is halved. So basically, every time a probe fails, you discard half of the elements you would otherwise still have had to search.

So in the worst case the number of candidates you still have to search shrinks as

N/2, (N/2)/2, ((N/2)/2)/2, .....
i.e.:

N/2^1, N/2^2, N/2^3, ....., N/2^x, .....

until you have finished searching, which happens when the element you are trying to find is at one of the ends of the list.

That shows the worst case is when you reach N/2^x, where x is such that 2^x = N.

In other cases you stop at N/2^x where x is such that 2^x < N. The minimum value of x can be 1, which is the best case.

Now, since mathematically the worst case is when the value of
2^x = N
=> log2(2^x) = log2(N)
=> x * log2(2) = log2(N)
=> x * 1 = log2(N)
=> x = log2(N), or more formally, ⌊log2(N) + 1⌋ probes.

Solution 9 - Algorithm

Here is the solution using the master theorem, with readable LaTeX.

Each step of the binary search recurrence converts the problem into one subproblem of half the size, with a constant amount of extra work. Therefore:

T(n) = T(n/2)+1

Substituting into the master theorem, we get:

T(n) = aT(n/b) + f(n), with a = 1, b = 2, and f(n) = 1

Now, because log_b(a) = log_2(1) = 0 and f(n) = 1, we can use the second case of the master theorem, because:

f(n) = Θ(1) = Θ(n^0) = Θ(n^(log_b a))

This means that:

T(n) = O(n^(log_b a) * log n) = O(n^0 * log n) = O(log n)

Solution 10 - Algorithm

A binary search works by dividing the problem in half repeatedly, something like this (details omitted):

Example looking for 3 in [4,1,3,8,5]

  1. Order your list of items: [1, 3, 4, 5, 8]
  2. Look at the middle item (4),
  • If it is what you are looking for, stop
  • If it is greater, look at the first half
  • If it is less, look at the second half
  3. Repeat step 2 with the new list [1, 3], find 3 and stop

It is a bi-nary search because you divide the problem in 2.

The search only requires log2(n) steps to find the correct value.
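
Here is a minimal Python sketch of the procedure just described; the function name and the step counter are my own additions:

def binary_search(items, target):
    items = sorted(items)              # step 1: order the list of items
    lo, hi = 0, len(items) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2           # step 2: look at the middle item
        if items[mid] == target:       # it is what we are looking for: stop
            return mid, steps
        elif items[mid] > target:      # it is greater: keep the first half
            hi = mid - 1
        else:                          # it is less: keep the second half
            lo = mid + 1
    return -1, steps                   # not found
print(binary_search([4, 1, 3, 8, 5], 3))   # (1, 3): index of 3 in the sorted list, after 3 probes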

I would recommend Introduction to Algorithms if you want to learn about algorithmic complexity.

Solution 11 - Algorithm

Since we cut the list in half every time, we just need to know in how many steps we reach 1 as we keep dividing the list by two. In the calculation below, x denotes the number of times we divide the list until we are left with one element (in the worst case).

1 = N/2^x

2^x = N

Taking log2 of both sides:

log2(2^x) = log2(N)

x*log2(2) = log2(N)

x = log2(N)

Solution 12 - Algorithm

T(N) = T(N/2) + 1

T(N) = T(N/2) + 1 = (T(N/4) + 1)+ 1

...

T(N) = T(N/N) + (1 + 1 + 1 + ... + 1) = T(1) + logN (base-2 log) = 1 + logN

So the time complexity of binary search is O(logN)

Solution 13 - Algorithm

OK, consider this loop:

for (i = 0; n > 1; n = n / 2)   /* keep halving n until one element is left  */
{
    i++;                        /* i counts how many times n has been halved */
}
1. Suppose the loop terminates at i = k, i.e. the loop executes k times.

2. At each iteration, n is divided in half.

2.a  n = n/2                          .... at i = 1
2.b  n = (n/2)/2 = n/(2^2)            .... at i = 2
2.c  n = ((n/2)/2)/2 = n/(2^3)        .... at i = 3
2.d  n = (((n/2)/2)/2)/2 = n/(2^4)    .... at i = 4

So at i = k, n = 1, which is obtained by dividing the original n by 2 k times:
1 = n/2^k
n = 2^k
k = log2(n)

Solution 14 - Algorithm

Let me make it easy for all of you with an example.

For simplicity, let's assume there are 32 elements in an array in sorted order, out of which we are searching for an element using binary search.

1 2 3 4 5 6 ... 32

Assume we are searching for 32. After the first iteration, we will be left with

17 18 19 20 .... 32

after the second iteration, we will be left with

25 26 27 28 .... 32

after the third iteration, we will be left with

29 30 31 32

after the fourth iteration, we will be left with

31 32

In the fifth iteration, we will find the value 32.

So, if we convert this into a mathematical equation, we get

32 × (1/2^5) = 1

=> n × 2^(-k) = 1

=> 2^k = n

=> k * log2(2) = log2(n)

=> k = log2(n)

Hence the proof.

Solution 15 - Algorithm

We can establish the recurrence relation T(n) = T(n/2) + 1.

T(n/2) = T(n/4) + 1, therefore T(n) = T(n/4) + 2

T(n/4) = T(n/8) + 1, therefore T(n) = T(n/8) + 3

In general, we can establish the relation

T(n) = T(n/2^k) + k   ---------- let's call this equation 1

If we keep on expanding, we will reach a point where

2^k = n. Applying log2 on both sides:

log2(2^k) = log2(n)

therefore k = log2(n).
Putting this value of k into equation 1:

T(n) = T(n/2^(log2(n))) + log2(n)

therefore T(n) = T(n/n) + log2(n), therefore T(n) = T(1) + log2(n)

hence T(n) = O(log n), since T(1) is a constant.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Bunny Rabbit | View Question on Stackoverflow
Solution 1 - Algorithm | duedl0r | View Answer on Stackoverflow
Solution 2 - Algorithm | abstractKarshit | View Answer on Stackoverflow
Solution 3 - Algorithm | Dhiren Biren | View Answer on Stackoverflow
Solution 4 - Algorithm | Michael Dorgan | View Answer on Stackoverflow
Solution 5 - Algorithm | vidstige | View Answer on Stackoverflow
Solution 6 - Algorithm | SirPhemmiey | View Answer on Stackoverflow
Solution 7 - Algorithm | Jonathan M | View Answer on Stackoverflow
Solution 8 - Algorithm | RajKon | View Answer on Stackoverflow
Solution 9 - Algorithm | id01 | View Answer on Stackoverflow
Solution 10 - Algorithm | Silas Parker | View Answer on Stackoverflow
Solution 11 - Algorithm | Abdul Malik | View Answer on Stackoverflow
Solution 12 - Algorithm | TizeeU0U | View Answer on Stackoverflow
Solution 13 - Algorithm | Piyush Jain | View Answer on Stackoverflow
Solution 14 - Algorithm | Sumukha H S | View Answer on Stackoverflow
Solution 15 - Algorithm | nikhil kekan | View Answer on Stackoverflow