Big O Notation and Sorting Algorithms

Raul Aguilera
Sep 7, 2018 · 5 min read

The purpose of this blog is to give an introduction to how Big O notation helps us compare the time complexity of algorithms and determine which can be faster than others depending on input size. It also includes the implementation of three popular sorting algorithms and an analysis of their complexity using Big O notation.

There’s a repository where you can find the algorithm implementations discussed in this article. https://github.com/RaulAguilera/fuzzy-guide.git


Big O Notation

Big O notation tells you how fast an algorithm is, but it doesn’t tell you the speed in seconds. It tells you how fast the algorithm grows, i.e., how quickly the run time of an algorithm increases as the size of the input increases. It’s entirely possible for O(n) code to run faster than O(1) code on specific inputs, because Big O only describes the rate of growth.

Big O establishes a worst-case run time. Suppose you are looking for a name in a phone book. Using simple search, it will take O(n) time to find a name because in the worst case you’ll have to look through all the records. Even if the name you are looking for appears at the beginning of the list (the best case), the time complexity of this algorithm is still O(n).
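To make the phone book example concrete, here is a minimal Python sketch of simple (linear) search; the function name and sample data are illustrative, not from the original post:

```python
def simple_search(names, target):
    """Linear scan: in the worst case every name is examined, so O(n)."""
    for i, name in enumerate(names):
        if name == target:
            return i
    return -1  # target not in the list: we looked at all n records

# Worst case: "Zoe" is absent, so every entry is checked.
simple_search(["Ana", "Bob", "Raul"], "Zoe")
```

Even when the target is the first entry, Big O still reports O(n), because the notation describes the worst case, not a lucky input.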

Some common big O run times are these:

O(log n), also known as log time. Example: Binary search

O(n), also known as linear time. Example: Simple search

O(n log n). Example: Merge sort

O(n²). Example: Insertion sort

O(n!). Example: The traveling salesperson
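As a contrast to simple search, here is a hedged Python sketch of binary search, the O(log n) example above; it assumes the input list is already sorted:

```python
def binary_search(sorted_arr, target):
    """Halve the search range each step: O(log n) comparisons."""
    low, high = 0, len(sorted_arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            low = mid + 1   # target is in the upper half
        else:
            high = mid - 1  # target is in the lower half
    return -1
```

Each iteration discards half of the remaining candidates, which is why doubling the input size only adds one more step.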

This is a graphical representation of how Big O characterizes the growth in time complexity of algorithms:


Sorting algorithms

A sorting algorithm is a series of instructions that takes an array as input, performs specified operations on the array and outputs a sorted array.

In the following section, I’ll explain the implementation of some common sorting algorithms and specify their time complexity using Big O notation.

Selection Sort

Selection sort is an inefficient algorithm on large lists. It has some advantages over other algorithms: it sorts ‘in place’ and it’s characterized by its simplicity.

The way it works is basically by dividing the array into two parts, sorted and unsorted. Two nested loops ‘populate’ the sorted part: the inner loop compares each of the items of the unsorted part to determine the index of the smallest item. When the smallest item of the unsorted part is found, it’s swapped into the position just after the sorted part, and this repeats until the whole array is sorted.

Implementation of Selection Sort
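The original post embeds the implementation here; since the code is not reproduced in this text, the following is a minimal Python sketch of selection sort as described above:

```python
def selection_sort(arr):
    """Sort arr in place by repeatedly selecting the smallest unsorted item."""
    n = len(arr)
    for i in range(n):
        # Find the index of the smallest item in the unsorted part arr[i:].
        min_index = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_index]:
                min_index = j
        # Swap it into the first unsorted position,
        # extending the sorted part by one element.
        arr[i], arr[min_index] = arr[min_index], arr[i]
    return arr
```

Note that the sort happens in place: no auxiliary array is allocated, which is the ‘in place’ advantage mentioned above.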

Analysis of Selection Sort

For this algorithm, we see that the dominant operations are the nested for loops, which follow the pattern “do this each time you do that”. Taking into account that the input size is n and both loops iterate over the input, we can say that the time complexity of this algorithm is O(n*n), most commonly written as O(n²) and read as “Big O of n squared”.

Insertion Sort

This algorithm is efficient for sorting a small number of elements.

The way it works is very similar to how we sort a hand of playing cards. We start with an empty left hand and the cards face down on the table. We then remove one card at a time from the table and insert it into the correct position of the left hand. To find the correct position for a card, we compare it with each of the cards already in the hand. At all times, the cards held in the left hand are sorted.

Implementation of Insertion Sort
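The original post embeds the implementation here; since the code is not reproduced in this text, the following is a minimal Python sketch of insertion sort following the playing-cards description above:

```python
def insertion_sort(arr):
    """Sort arr in place, like inserting cards into a sorted hand."""
    for i in range(1, len(arr)):
        key = arr[i]  # the next "card" to place into the sorted hand
        j = i - 1
        # Shift larger sorted elements one position to the right...
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # ...then insert the key in its correct spot
    return arr
```

At every iteration the prefix arr[0..i] is sorted, mirroring the invariant that the cards held in the left hand are always in order.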

Analysis of Insertion Sort

In this algorithm we also have two nested loops, but with a difference: in the inner loop we perform a comparison to check whether we need to shift an element. This means that in the best case (the array is already sorted) we only need to perform about n operations. However, since Big O is based on the worst-case scenario (the array is sorted in reverse order), we have to consider that we’ll perform roughly n*n operations, i.e., shift every earlier element for each item in the array. So the time complexity of Insertion Sort is O(n²).

Quick Sort

This algorithm uses a technique named recursion to sort an array. It’s faster than Insertion and Selection sort in most of the cases. I’m saying “most of the cases” because its run time depends on the pivot choice, and with unlucky pivots it degrades, as we’ll see in the analysis below. (It’s also cataloged as an “unstable” algorithm, meaning it may reorder equal elements.)

The way it works is by first determining a pivot (there are different approaches to determining a pivot, but in this article we’ll use the last element). Once a pivot is selected, we partition the array into two sub-arrays, with elements greater than the pivot in one and smaller in the other. Then we repeat the same process on both resulting sub-arrays recursively.

Implementation of Quick sort
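The original post embeds the implementation here; since the code is not reproduced in this text, the following is a minimal Python sketch of quick sort using the last element as the pivot (the Lomuto partition scheme, a common way to implement that choice):

```python
def partition(arr, low, high):
    """Place the pivot (last element) in its final position; O(n) work."""
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] <= pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]  # grow the "smaller" region
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1  # index where the pivot now rests

def quick_sort(arr, low=0, high=None):
    """Sort arr in place by partitioning and recursing on both halves."""
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quick_sort(arr, low, p - 1)   # elements smaller than the pivot
        quick_sort(arr, p + 1, high)  # elements greater than the pivot
    return arr
```

The single loop lives in `partition`; the recursive calls on both halves are what the complexity analysis below accounts for.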

Analysis of Quick Sort

In this algorithm, we see only one for loop, so we could be tempted to say that the complexity is O(n), which is not correct: O(n) is only the cost of partitioning the array once. We also need to take into account the recursive calls. In the best case, the pivot splits the array in half each time, so at each step the problem size is halved and we have log n levels of nested calls. Since each level of recursive calls does O(n) work in total, we can say that Quick Sort’s time complexity in that case is O(n log n). But in the worst case (which is the one we’re interested in), every partition attempt leaves all the elements on one side of the pivot, either all greater or all smaller, so we end up with n levels of recursion, which results in a complexity of O(n²).

Conclusion

Big O notation might be a difficult concept at first, but once it “clicks” it becomes a lot easier. The only way to master this concept is to practice analyzing different algorithms. You might be thinking “why do I need this?”, and my answer would be that understanding Big O notation can help you determine the best algorithm to apply when you have several approaches to solving a coding problem.

References:

  1. Aditya Y. Bhargava, Grokking Algorithms.
  2. Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, Clifford Stein, Introduction to Algorithms, Third Edition.
  3. Gayle Laakmann McDowell, Cracking the Coding Interview.
  4. https://brilliant.org/wiki/sorting-algorithms/
  5. http://bigocheatsheet.com/
