# Binary Search

## Algorithms for Everyone

If you’re new to programming or a self-taught programmer, binary search might sound daunting or complex. After all, *binary* is the language that your computer speaks (ones and zeroes), right?

1101 1001 0001 1010 1010 0010 1100 0101 1000 0100

In this case, binary search refers to a “binary decision”, which means that every time we make a decision, we have two choices:

- One choice leads us to whatever we’re looking for.
- The other leads to a dead end.

The “trick” to binary search is knowing what question to ask.

In the game of “20 Questions”, players ask “Yes/No” questions to guess an item. The type of question you ask has a significant effect on how the game plays out!

**“Is it Taylor Swift?”**

The answer to this question lets you know that the item is (or isn’t) Taylor Swift, but it doesn’t significantly reduce the potential answers. If we always ask questions like this, it would take forever to find the answer.

**“Is it living?”**

The answer to this question tells you whether the item is living or an inanimate object. If the answer is “No”, then we know that it isn’t Taylor Swift, but we also know that it isn’t a puppy or a flower. This question removes half of the potential answers!

Binary search is commonly used in solving more complex problems, so it’s worth learning how and why it works.

# How does binary search work?

Let’s imagine that you’re throwing an epic party. You’ve sent out the invites and everyone’s coming! Tonight’s the night and you’ve got the guest list in your hands. Everyone is sorted alphabetically by last name, but it’s still **100 pages long**!

*(I told you, this is going to be an epic party.)*

Kit Harington is the first to show up, so you start flipping through the pages to find his name.

Where do you start?

Well, you *could* start on the first page and look at every single name until you find him. Since the guest list is in alphabetical order, you’ll eventually find his name. The only problem is that Mark Wahlberg is up next, and he’s not going to enjoy waiting while you flip through the first 90 pages to get to the Ws.

**Another approach is to pick a page somewhere in the middle of the stack.**

“Harington” is in the first half of the alphabet, so take a look at the first name on the page. If it comes after “Harington”, you know that you only need to search the first half of the guest list.

You keep doing this until you find the name that you’re looking for:

- Pick a page somewhere in the middle.
- Decide whether to continue searching the first half or the second half.

This is exactly how binary search works.

- Given a sorted list, check the item in the middle of the list to see if it’s what you’re looking for.
- If it’s not, compare the middle item to the item you’re looking for to see whether it is less than or greater than it.
- Each time you perform this check, you throw out half of the list, because you know your item can only be in the *other* half.
- Repeat this until you’ve found what you’re looking for… or there’s nothing left and you know that it isn’t there.
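These steps can be sketched as a short iterative function over a sorted Python list (a minimal sketch; the function name and the `None` return for “not found” are my choices, not a standard API):

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or None if it's absent."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2      # pick the "page" in the middle
        if items[mid] == target:
            return mid               # found it!
        elif items[mid] < target:
            low = mid + 1            # keep searching the second half
        else:
            high = mid - 1           # keep searching the first half
    return None                      # nothing left: it isn't there
```

For example, `binary_search(["Harington", "Swift", "Wahlberg", "Zane"], "Wahlberg")` returns `2` after only two checks.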

When you’re designing an algorithm, it’s helpful to write down the steps like this because it lets you think about how to solve the problem without implementation details getting in the way.

# Why is binary search important?

Many other algorithms and data structures build on the idea of binary search. Most databases use some form of binary search and you can probably find it used in the code that searches for files on your computer.

The effectiveness of this algorithm lies in breaking down the problem into smaller parts. Since the input must be a sorted list, this algorithm uses the property of being sorted to divide the problem in half at every step.

This idea isn’t limited to computers either. If we have a daunting list of errands or tasks that need to be done, one way to increase your efficiency is by figuring out what you **don’t** have to do. You’re going to be more efficient if you can focus on tasks that produce value and ignore the tasks that don’t.

In binary search, when we compare the item in the middle of the list to the item that we’re looking for, what we’re actually doing is trying to figure out what work we can remove from our to-do list.

Binary search is all about throwing out the work that doesn’t provide value, so that we can focus on the work that gives us more information.

# Implementation

The following implementation is written in Python because the language has a lot of built-in features that make it easy to read.

It’s also implemented using recursion. Although recursion can be a difficult topic for beginning programmers to understand, I think the recursive implementation underscores the behavior of the binary search algorithm (which is that if we don’t find what we’re looking for, we keep searching on smaller parts of the original list). If the concept of recursion is tough for you, Daniel King wrote a great post about learning to think with recursion.

## The Algorithm

In this algorithm, we check the item in the middle of the list at every step. If that item is the one that we’re looking for, then we’re done!

Otherwise, we break up the list into two smaller lists (the first half & the second half), but now we have to decide which half to search. That decision is done by checking whether the item in the middle of the list comes *before* or *after* the item we’re looking for.
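That decision rule translates into a recursive Python function along these lines (a sketch under my own naming; the `low`/`high` parameter defaults and returning `None` for a missing item are assumptions, not necessarily the article’s exact listing):

```python
def binary_search(items, target, low=0, high=None):
    """Recursively search a sorted list; return target's index or None."""
    if high is None:
        high = len(items) - 1
    if low > high:
        return None                  # nothing left: the item isn't there
    mid = (low + high) // 2
    if items[mid] == target:
        return mid                   # the middle item is the one we want
    elif items[mid] < target:
        # target comes after the middle item: search the second half
        return binary_search(items, target, mid + 1, high)
    else:
        # target comes before the middle item: search the first half
        return binary_search(items, target, low, mid - 1)
```

Each recursive call searches a smaller slice of the original list, which mirrors the behavior described above: if we don’t find what we’re looking for, we keep searching on smaller parts of the original list.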

# Performance

*If you’re new to Big-O Notation, Justin Abrahms wrote a helpful post called “**Big-O notation explained by a self-taught programmer**”. If any of the following is confusing, feel free to take a break and read Justin’s article.*

When programmers talk about the performance of an algorithm, they’re usually concerned with the worst-case scenario.

To understand why the worst-case scenario is the most useful, let’s start by analyzing the runtime performance of the brute-force approach.

## Start on the first page of the guest list and read everyone’s name until we find the name that we’re looking for…

**In the best case**, the name that we’re looking for is the first name in the list. That takes no time at all, but what are the odds that you’ll only ever search for the first name in the list?

**In the worst case**, we have to find the name of the last person on the guest list. That means flipping through 100 pages of names! What if there were 1,000 pages, or 1,000,000?

If there are **n** people on the list, then we would say that this algorithm’s performance is **O(n)**. This means that in the worst-case, you’ll have to look at every item.

The reason that programmers care about the worst-case scenario is that it’s a guarantee: calling this function will never take longer than the worst case.

**In the case of binary search**, we throw away half of the list every time we check to see whether we’ve found a match.

If there are 100 people to start with:

- The first check removes 50 names from the potential search space.
- The second check removes 25 names.
- The third check removes 13 names.
  *(12 names + the name that you’re checking)*
- The fourth check removes 7 names.
  *(6 names + the name that you’re checking)*
- The fifth check removes 3 names.
  *(2 names + the name that you’re checking)*
- The sixth check removes 1 name, leaving just one.
- The seventh check examines the last remaining name.

This means that binary search takes no more than 7 checks to search a list of 100 people. If you perform this exercise for every possible list, you would find that the graph looks a lot like a logarithmic curve. In Big-O notation, this is written as **O(log(n))**.
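You can verify that count with a couple of lines of Python: the worst-case number of checks for a list of *n* items works out to ⌊log₂(n)⌋ + 1 (the helper name `max_checks` is mine):

```python
import math

def max_checks(n):
    """Worst-case number of binary-search checks for a sorted list of n items."""
    return math.floor(math.log2(n)) + 1

print(max_checks(100))        # 7
print(max_checks(1_000_000))  # 20
```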

Plotted against the size of the list, the brute-force method *(starting at the beginning of the list and reading every name)* doesn’t scale very well. For a list of 1,000,000 names, we may have to read all 1,000,000 names.

The party would be over before you could find Billy Zane’s name!

On the other hand, using binary search on a list of 1,000,000 names, we can **always** find Billy Zane’s name in 20 tries or fewer. That’s the guarantee that Big-O notation provides. No matter how big the list becomes, the number of steps needed to find an item is *no worse* than **log(n)**.

To put the performance of binary search into context, let’s imagine that you’re a developer at Facebook, which currently has more than 1 billion users. Let’s say that you have to build a new feature that requires looking up a specific user name. You probably don’t want to use brute force to search a billion users! If each try takes 1 millisecond, the brute-force approach could take over **270 hours**, but a binary search would return an answer in about **30 milliseconds**. That’s a tremendous improvement, and why binary search is a useful algorithm to understand.
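The arithmetic behind those numbers, assuming one check per millisecond as above:

```python
import math

n = 1_000_000_000                                  # one billion users
brute_force_ms = n                                 # worst case: check every name
binary_search_ms = math.floor(math.log2(n)) + 1    # worst case: 30 checks

print(brute_force_ms / 1000 / 3600)                # ~277.8 hours
print(binary_search_ms)                            # 30 (milliseconds)
```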

*This is part of a series that I’m creating to teach the fundamentals of computer science and algorithms in a way that’s approachable to people without a computer science background.*