Which Sorting Algorithms to Know for the Tech Interview

Mera Stackhouse
8 min read · Apr 25, 2019


There are many factors considered when applying for a job, but you should definitely have a strong understanding of sorting algorithms to give yourself the best chance of acing the dreaded technical interview. There are a ton of sorting algorithms in the world which could take you forever to memorize, but you don’t need to know them all.

There are a few key elements to each algorithm:

  • conceptually how it works
  • code implementation
  • time complexity
  • space complexity
  • stability of results

Time Complexity

We already know how to sort intuitively; we do it all the time in real-world situations. And we have a strong enough background in code now that we can even implement some basic algorithms. But what we don’t know about yet is time complexity. Although it isn’t an aspect of full stack development per se, it is an important aspect of choosing how to design your back-end, and very important for the tech interview.

Time complexity is concerned with the run time of an algorithm: how long it takes to sort the input and produce an output.

The notation for time complexity starts with big O, which stands for order (of). There are also big Ω, big Θ, small o, and small ω, which all describe slightly different aspects of the run time. The important takeaway is that they all talk about how quickly an algorithm runs at different input sizes.

Big O is usually used loosely in casual conversation, but mathematically big O is an upper bound on the run time (most often quoted for the worst case), while big Θ is a tight bound that pins the run time from both above and below. Sometimes the average and worst cases are the same, which leads to more overlap in naming conventions.

[Image: GeeksforGeeks]

It is important to know how to talk about each algorithm in terms of its specific time complexity. Each algorithm can be classed into one of the following categories.

A logarithmic algorithm — O(logn)
Runtime grows logarithmically in proportion to n.

A linear algorithm — O(n)
Runtime grows directly in proportion to n.

A superlinear algorithm — O(nlogn)
Runtime grows in proportion to nlogn, slightly faster than linear.

A polynomial algorithm — O(n^c)
Runtime grows faster than all of the previous classes, as a power of n.

An exponential algorithm — O(c^n)
Runtime grows even faster than a polynomial algorithm, multiplying by a constant factor with each added element.

A factorial algorithm — O(n!)
Runtime grows the fastest and quickly becomes unusable for even small values of n. (List adapted from GeeksforGeeks.com.)
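To make those growth rates concrete, here is a quick Python sketch printing rough operation counts for a few input sizes (the exponential and factorial classes are left out because they blow up too fast to tabulate):

```python
import math

# Approximate "number of operations" per complexity class, for a few
# input sizes. These are the raw growth functions, not real benchmarks.
for n in (10, 100, 1000):
    log_n = round(math.log2(n))
    n_log_n = round(n * math.log2(n))
    print(f"n={n}: log n ~ {log_n}, n log n ~ {n_log_n}, n^2 = {n ** 2}")
```

Even at n = 1000, the gap between O(nlogn) (roughly 10,000 operations) and O(n²) (1,000,000 operations) is already a factor of 100, which is why the class an algorithm falls into matters so much.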

In order to speak about an algorithm, we need to know how it behaves with different inputs. We may choose an algorithm because it is fast for small arrays, or fast for already sorted arrays; maybe it’s good for incredibly large arrays, or works well with inputs that have specific attributes. All of these aspects would lead us to choose some algorithms over others. During an interview you should ask questions about the input of your algorithm and choose based on those factors.

The Algorithms

[Image: GeeksforGeeks]

There are a bunch of different sorting algorithms and even more algorithms with other functions such as searching!

Quick Sort

Concept: Using a pivot element, partition the array into two sections, where everything below the pivot is smaller and everything above it is larger. Then recursively sort the values to the left of the pivot and the values to the right. This is a ‘divide and conquer’ algorithm, since it breaks the larger list into smaller lists which are easier to sort.

Code: Learn how to implement this code in your language! The implementation will involve a partition function where you split the list into sub-parts and recursively sort each part. It requires you to pick where the initial pivot point is and how the pivot moves through the list.
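As a sketch of what that looks like, here is one common Python version using the Lomuto partition scheme; picking the last element as the pivot is just one choice (median-of-three is often safer in practice):

```python
# A minimal in-place quicksort sketch (Lomuto partition, last-element pivot).
def quick_sort(arr, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)   # pivot lands at its final sorted index
        quick_sort(arr, lo, p - 1)   # recursively sort the left section
        quick_sort(arr, p + 1, hi)   # and the right section

def partition(arr, lo, hi):
    pivot = arr[hi]
    i = lo
    for j in range(lo, hi):
        if arr[j] < pivot:           # smaller values move left of the pivot
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # drop the pivot into place
    return i

nums = [7, 2, 9, 4, 1]
quick_sort(nums)
print(nums)  # [1, 2, 4, 7, 9]
```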

Time complexity: Average case is O(nlogn); worst case is O(n²).

Note: There are plenty of edge cases where it is less successful, but it is usually considered the all around best if you had to guess what to use. You can avoid most worst case scenarios by choosing the pivot wisely (the closest to the median possible). You can also mention 3-way quick sort/Dutch National Flag as an alternative to quick sort for repetitive inputs.

Fun Fact: Ruby uses quick sort for its .sort method.

Merge Sort

[Image: Wikipedia]

Concept: Split the array in half repeatedly until you have single-element (already sorted) arrays, then merge the sorted arrays back together, putting everything in the proper order.

Code: Learn how to implement this code in your language! The implementation utilizes a merge function called again and again to reassemble all elements incorporating them in the correct position.
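One way that merge function might look in Python (this version returns a new list rather than sorting in place):

```python
# A merge sort sketch: split recursively, then merge the sorted halves.
def merge_sort(arr):
    if len(arr) <= 1:                 # single-element lists are already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # sort each half independently
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps equal elements stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # one side may have leftovers
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 3, 8, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```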

Time complexity: All three cases are O(nlogn).

Note: If two arrays are already sorted, it is especially easy to merge them. This algorithm takes apart an unsorted array to take advantage of this fact. Also worth mentioning is that the space complexity of merge sort is good at O(n), so if you come across an interviewer who asks you to prioritize space, this would be a good sort to pick.

Insertion Sort

Concept: Iterate over list and place each element in its correct place as you get to it from first to last.

Code: A while loop inside a for loop can create a process that keeps track of the indices where you can insert the latest element in the correct spot.
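That while-inside-for pattern might look like this in Python:

```python
# Insertion sort sketch: shift larger values right, then insert.
def insertion_sort(arr):
    for i in range(1, len(arr)):
        current = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > current:  # shift larger values one slot right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = current                # insert into the open slot
    return arr

print(insertion_sort([4, 2, 5, 1, 3]))  # [1, 2, 3, 4, 5]
```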

Time complexity: Average and worst cases are O(n²); best case is O(n) for an already (or nearly) sorted list.

Note: Best to use when the list is small and/or almost totally sorted. It takes a long time but it does have some pluses in stability and space. Be sure to mention this as an alternative to the above two sorts if the situation calls for stability/space.

Heap Sort

[Image: GeeksforGeeks]

Concept: Build a heap (a tree-based data structure) from the elements, then repeatedly extract the largest (or smallest) element to produce sorted output.

Code: The implementation uses heapify to make the list into a heap data structure (which is possible to do in array form — see below). And then the second part is manipulating the elements by taking out the largest (or smallest) elements in order using the information stored in the data structure.
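A Python sketch of that two-phase approach, with the max-heap stored directly in the array:

```python
# Heap sort sketch: build a max-heap in the array, then repeatedly
# swap the max to the end and shrink the heap.
def heapify(arr, n, root):
    largest = root
    left, right = 2 * root + 1, 2 * root + 2   # children of index `root`
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != root:
        arr[root], arr[largest] = arr[largest], arr[root]
        heapify(arr, n, largest)               # sift the swapped value down

def heap_sort(arr):
    n = len(arr)
    for i in range(n // 2 - 1, -1, -1):        # phase 1: build the max-heap
        heapify(arr, n, i)
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]    # phase 2: move current max to the end
        heapify(arr, end, 0)                   # restore the heap on the rest
    return arr

print(heap_sort([9, 4, 7, 1, 3]))  # [1, 3, 4, 7, 9]
```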

[Image: a heap in array form — Wikipedia]

Time complexity: All cases are O(nlogn).

Note: This is not usually chosen for raw efficiency, because quick and merge sort beat it in almost all practical aspects. However, the heap data structure is important to know and understand for lots of different applications, and heap sort is worth mentioning because its worst case stays at O(nlogn). It uses the structure of the binary heap, rather than rescanning the list itself, to find the next element, which keeps it efficient even in edge cases.

Bucket Sort, Radix Sort, and Counting Sort

Concept: Bucket creates ‘buckets’ of elements and sorts each. Radix uses bits to sort by digits, starting with first or last digit. And counting sorts by keeping track of the amount of times each value appears in the input list.

Code: You can look up the implementation of each!
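As one example, counting sort can be sketched in a few lines of Python (this version assumes non-negative integer inputs with a small maximum value k):

```python
# Counting sort sketch: tally occurrences of each value, then emit them
# in order. Assumes non-negative integers with a small max value.
def counting_sort(arr):
    if not arr:
        return []
    k = max(arr)
    counts = [0] * (k + 1)
    for value in arr:
        counts[value] += 1            # tally how often each value appears
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)   # emit each value `count` times
    return out

print(counting_sort([3, 1, 4, 1, 5, 1]))  # [1, 1, 1, 3, 4, 5]
```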

Time complexity: These guys are super fast! Average and best is O(n+k) for bucket and counting and O(nk) for radix (where k is either the upper end of the range of inputs or number of bits).

Note: The inputs need to be very specific for these functions to work. Counting sort works best when the range of possible values (k) is small. Bucket sort is best when the elements are evenly distributed over the range, so the buckets stay balanced. And radix sort works best when inputs have a fixed number of digits (or bits) over a known range.

Bubble Sort and Selection Sort

Conceptually: Bubble sort goes through the list comparing adjacent element pairs and swapping them when they are out of order. Selection sort repeatedly finds the minimum of the unsorted portion and, starting at the beginning, places each one in order.

Code: Iterate many times over the list, incrementing indexes and swapping elements as needed.
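For reference, here is a minimal bubble sort in Python, including the early-exit check that bails out once a full pass makes no swaps:

```python
# Bubble sort sketch -- worth knowing for explanations, not for real use.
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):          # the tail is already sorted
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:                     # a pass with no swaps means done
            break
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```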

Time complexity: Too long! Average and worst cases are O(n²) for both. Bubble sort can manage O(n) on an already sorted list (with an early-exit check), but selection sort’s best case is still O(n²).

Note: These guys are not efficient and don’t really have anything going for them to merit their use. You should be familiar with them in case you need to list out algorithms, but don’t pick one of them to implement in the interview, and be ready to tell the interviewer why.

Cheatsheets

Find a cheatsheet to help yourself memorize the basic attributes of each algorithm. There are a ton all over the internet, just give it a quick google. You will also need to know about search algorithms, graph algorithms, string algorithms, random algorithms (ex: swapping without extra variables), bit-manipulation, and the efficiency of operations on each kind of data structure and when to use it. See sources for more specific suggestions!


Key Learning Points

  • Understand and implement quick and merge sort
  • Understand the use cases of insertion, bucket, counting, radix sort
  • Learn how heap sort works with the tree data structure
  • Know why you shouldn’t use bubble sort or selection sort
  • You can throw in an allusion to Timsort if you want to sound fancy (or if you use Python)
  • There’s a bunch of other stuff to learn beside sorting algorithms

Sources
