Binary search

Binary search is a decrease and conquer search algorithm that can be used on a sorted array. It works by comparing the search value with the middle element of the array: if the search value is smaller, it recursively searches the lower half; if larger, the upper half. This repeats until either the value is found or the remaining range is empty.

The binary search algorithm is very similar to the binary search tree's search operation, though not identical. Binary search's average and worst case time complexity is O(\log n). The binary search tree also has an average case of O(\log n), but a worst case of O(n); this occurs when the tree degenerates into a chain whose height equals the number of items in the tree (very unlikely in any real scenario).

The real power of binary search shows itself when searching a huge list of items, as with any logarithmic algorithm. A linear search must potentially look at every item in the input, whereas a logarithmic algorithm works by repeatedly halving the input data. Consider a list containing 1 million items: if the list is sorted, binary search can find an item in no more than 20 comparisons.

Values compared    Possible candidates
0                  1,000,000
1                    500,000
2                    250,000
3                    125,000
4                     62,500
5                     31,250
6                     15,625
7                      7,812
8                      3,906
9                      1,953
10                       976
11                       488
12                       244
13                       122
14                        61
15                        30
16                        15
17                         7
18                         3
19                         1
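The halving shown in the table above can be reproduced with a few lines of Java. This is just an illustrative sketch; the class and method names are my own, not from any library:

```java
public class HalvingDemo {
    // Counts how many times the candidate range must be halved before only
    // one candidate remains. One final comparison then checks that candidate,
    // giving the "no more than 20 steps" figure for 1 million items.
    public static int countHalvings(int candidates) {
        int steps = 0;
        while (candidates > 1) {
            candidates /= 2; // each comparison discards half the candidates
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(countHalvings(1000000)); // prints 19
    }
}
```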

While maintaining a sorted list does take more effort on insertion, it may be worth it when we consider the incredible speed of the search. It all depends on which operation your application needs to be faster.

Alternatively, we can get the same search speed with a smaller insert cost by using a balanced binary search tree, though there is quite a bit more to the implementation.
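Java's standard library already provides both of these options: java.util.Arrays.binarySearch performs binary search over a sorted array, and java.util.TreeSet is backed by a balanced (red-black) tree. A quick sketch:

```java
import java.util.Arrays;
import java.util.TreeSet;

public class BuiltInSearch {
    public static void main(String[] args) {
        int[] sorted = {1, 3, 5, 7, 9};

        // Arrays.binarySearch returns the index of the value if found,
        // or a negative number (related to the insertion point) otherwise.
        System.out.println(Arrays.binarySearch(sorted, 7));     // 3
        System.out.println(Arrays.binarySearch(sorted, 4) < 0); // true

        // TreeSet is backed by a red-black tree, so contains() is
        // O(log n) even in the worst case.
        TreeSet<Integer> set = new TreeSet<>(Arrays.asList(1, 3, 5, 7, 9));
        System.out.println(set.contains(5)); // true
    }
}
```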

Complexity

Time (worst case)      O(\log n)
Time (best case)       O(1)
Time (average case)    O(\log n)
Space (worst case)     O(1) auxiliary

Decrease and conquer

These types of algorithms have often been labelled divide and conquer algorithms, like merge sort. The name decrease and conquer has been proposed to differentiate algorithms that reduce a problem to a single smaller sub-problem (as binary search does) from algorithms that split a problem into two or more sub-problems, which allows for better parallelism.

Code

public class BinarySearch {
    public static boolean search(int[] sortedArray, int value) {
        return search(sortedArray, value, 0, sortedArray.length - 1);
    }

    private static boolean search(int[] sortedArray, int value, int min, int max) {
        if (max < min)
            return false;

        // Computed this way, rather than (min + max) / 2, to avoid integer
        // overflow when min + max exceeds Integer.MAX_VALUE.
        int mid = min + (max - min) / 2;

        if (sortedArray[mid] > value)
            return search(sortedArray, value, min, mid - 1);
        else if (sortedArray[mid] < value)
            return search(sortedArray, value, mid + 1, max);
        else
            return true;
    }
}
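Recursion isn't required here; the same algorithm can be written iteratively, which avoids any stack growth. A self-contained sketch:

```java
public class IterativeBinarySearch {
    // Iterative equivalent of the recursive search above: the [min, max]
    // range shrinks by half on each pass of the loop.
    public static boolean search(int[] sortedArray, int value) {
        int min = 0;
        int max = sortedArray.length - 1;
        while (min <= max) {
            int mid = min + (max - min) / 2; // avoids overflow of min + max
            if (sortedArray[mid] > value)
                max = mid - 1;
            else if (sortedArray[mid] < value)
                min = mid + 1;
            else
                return true;
        }
        return false;
    }

    public static void main(String[] args) {
        int[] sorted = {2, 4, 8, 16, 32, 64};
        System.out.println(search(sorted, 16)); // true
        System.out.println(search(sorted, 5));  // false
    }
}
```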

Textbooks

Here are two CS textbooks I personally recommend: The Algorithm Design Manual (Steven S. Skiena) is a fantastic introduction to data structures and algorithms without getting too deep into the maths side of things, while Introduction to Algorithms (CLRS) provides a much deeper, maths-heavy look.

