
An Introduction to Sorting Algorithms in JavaScript

Oscar Luna
8 min read · Jun 7, 2021


Hello! Today I want to go over some ever-so-important sorting algorithms. You don’t need to be an experienced programmer to know that data gets sorted: maybe you have real-world information that needs to be easily accessed by both the machine and the end user, or maybe you need to build an ordered list of every user in a database. Sorting algorithms are very powerful; they can build empires, and they can topple them. I am personally frightened of being the one to mess up a sorting algorithm in a project, even my own. Fortunately, that’s not likely to happen. Let’s dive right in with some simple sorting algorithms!

Sorting Caveats

The ways Array.prototype.sort can go wrong (created at https://carbon.now.sh)

There are a couple of things we need to be aware of before we start sorting. The first is the temptation to reach for JavaScript’s built-in sort() method, which sorts an array in place and returns it. By default, Array.prototype.sort() orders elements by converting them into strings and comparing their sequences of UTF-16 code unit values, which means an array of numbers like [1, 10, 2] comes back in the wrong order unless you supply a comparator. Another tradeoff is the method’s inability to order locale-specific characters sensibly without the help of String.prototype.localeCompare(). You can read more about Array.prototype.sort() in the MDN Documentation. Lastly, there’s no guarantee when it comes to its time or space complexity: the sort() method’s complexities are implementation-dependent, varying from engine to engine, as evidenced above.

Sorting is completely doable in more predictable and more powerful ways than with JavaScript’s sort() method. In fact, there’s a wide range of sorting algorithms to pick from. Obviously, some are more heavily used than others, and some exist strictly for educational purposes, just like the first algorithm we are going to discuss: bubble sort.

Bubble Sort

I’ll get this out of the way now: bubble sort is not a particularly great sorting algorithm to implement. For starters, it has quadratic time complexity, O(n²). You’ll see the other reasons shortly. For now, it’s important to learn bubble sort so you can understand the sorting algorithms that build on its ideas.

Bubble sort works by looping through an array, comparing each pair of adjacent elements, and bubbling the larger value of the pair toward the end. If a pair’s first element is greater than the second, the two elements get swapped. Otherwise, the pair is left alone and the algorithm proceeds to the next pair. What makes bubble sort a less-than-ideal approach among sorting algorithms? A single pass is rarely enough: as long as any pass performs a swap, the whole array must be traversed again. That outer loop wrapped around the inner pair-comparison loop is a nested loop, and it bumps our time complexity from O(n) to O(n²). Let’s take a look:

A bubble sort function, complete with nested loops and, more importantly, a temporary variable to hold the larger value while the pair is swapped. (created at https://carbon.now.sh)

Bubble sort uses a temporary variable to hold the value that needs to be swapped in a given pair of elements. If the left element is greater than the right one, the left element gets stored in the temporary variable while the right element is assigned to the left element’s index. Lastly, the held (greater) value is assigned to the right element’s index.
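The swap described above can be sketched like this (a minimal implementation, with an early-exit flag for the already-sorted case; it isn’t necessarily identical to the code in the screenshot):

```javascript
function bubbleSort(array) {
  for (let i = 0; i < array.length; i++) {
    let swapped = false;
    // After each pass, the largest unsorted value has bubbled to the end,
    // so each pass can stop one element sooner.
    for (let j = 0; j < array.length - 1 - i; j++) {
      if (array[j] > array[j + 1]) {
        const temp = array[j];      // hold the greater value
        array[j] = array[j + 1];    // right element takes the left index
        array[j + 1] = temp;        // held value takes the right index
        swapped = true;
      }
    }
    if (!swapped) break; // no swaps in a full pass: already sorted
  }
  return array;
}

console.log(bubbleSort([5, 1, 4, 2, 8])); // [1, 2, 4, 5, 8]
```

The early exit is what gives bubble sort its O(n) best case on an already-sorted array.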

As I mentioned earlier, bubble sort, in its worst case, runs in O(n²) time. However, when the array in question is already sorted, bubble sort can run, at best, in O(n), since it only needs one pass through the array to learn that no swaps are necessary. Yes, my friends, its best case occurs when it is not needed. But one important thing to take from bubble sort is the temporary variable that holds one element while the other is moved to its new index. This swap pattern will come in handy when learning the more advanced sorting algorithms.

Selection Sort

Selection sort is another sorting algorithm meant mostly for educational purposes. It repeatedly scans the unsorted portion of an array for its smallest element, swaps that element into the first unsorted position, then repeats this pattern until the array is sorted. This means selection sort uses nested loops, one to fix each position and one to find each minimum, giving it a time complexity of O(n²). Below is a selection sort function that implements these steps:

Selection Sort (created at https://carbon.now.sh)

In the example above, we define an arbitrary minimum value as well as its index to compare against the rest of the array. Then a second, inner loop updates the minimum whenever it encounters a value smaller than the current one, recording both the lesser value and its index. However, due to its nested loops, selection sort’s time complexity is O(n²) in both its worst case and its best case. This isn’t optimal. In fact, no sorting algorithm can run faster than O(n), since every element must be examined at least once, and comparison-based sorts can do no better than O(n log n). Finding an optimal approach varies per case, and each sorting algorithm has benefits and trade-offs to consider. Selection sort would be cumbersome to execute on a larger list of numbers, given that we must scan the remaining values for every position in the array. We can, however, implement a divide-and-conquer approach and divide the work into smaller pieces.
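Here is a sketch of those steps (a minimal version along the lines described, not necessarily the exact code from the screenshot):

```javascript
function selectionSort(array) {
  for (let i = 0; i < array.length; i++) {
    // Assume the current position already holds the minimum
    // of the unsorted portion.
    let minIndex = i;
    for (let j = i + 1; j < array.length; j++) {
      if (array[j] < array[minIndex]) {
        minIndex = j; // found a smaller value; remember where it lives
      }
    }
    if (minIndex !== i) {
      // Swap the true minimum into the first unsorted position.
      const temp = array[i];
      array[i] = array[minIndex];
      array[minIndex] = temp;
    }
  }
  return array;
}

console.log(selectionSort([29, 10, 14, 37, 13])); // [10, 13, 14, 29, 37]
```

Note that the inner loop always runs to the end of the array even when the input is sorted, which is why the best case is still O(n²).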

Merge Sort

Merge sort is precisely the algorithm to implement this approach. It’s arguably one of the more complex algorithms to write, but it executes in linearithmic time, O(n log n). Merge sort achieves this by dividing an array at its median index, continually dividing until it reaches the smallest possible groups, sorting each subgroup, then merging the sorted subgroups into a new sorted array. No nested comparison loops are required, and we compare small groups of numbers rather than the array as a whole.

A mergeSort function that splits the array into smaller and smaller subarrays, and a merge function that sorts and concatenates both halves. (created at https://carbon.now.sh)

The example above implements merge sort with a divide-and-conquer approach. Our problem is divided into smaller subsets, which get solved individually before these sub-solutions are pieced back into a single solution. This implementation uses two functions, mergeSort() and merge(). The first function receives an array to divide into smaller and smaller halves. As you can see, this is done recursively, with a base case and recursive calls both defined: an array of one value is considered already sorted, of course. We also identify the array’s median index and assign it to a variable, which the function uses to divide the received array in two, recursively, until the base case is reached.

From that point, the second function handles the actual merging and sorting. It receives two subarrays, the left half and the right half, repeatedly compares their front elements, and appends the smaller of the two to a new array that gets returned.
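The two functions described above can be sketched like this (an illustrative implementation of the same structure, not the screenshot verbatim):

```javascript
function mergeSort(array) {
  // Base case: an array of one (or zero) elements is already sorted.
  if (array.length <= 1) return array;

  // Divide at the median index and recurse on each half.
  const middle = Math.floor(array.length / 2);
  const left = mergeSort(array.slice(0, middle));
  const right = mergeSort(array.slice(middle));

  return merge(left, right);
}

function merge(left, right) {
  const result = [];
  let i = 0;
  let j = 0;

  // Repeatedly take the smaller front element of the two sorted halves.
  while (i < left.length && j < right.length) {
    if (left[i] <= right[j]) {
      result.push(left[i++]);
    } else {
      result.push(right[j++]);
    }
  }

  // One half may still have elements left; they are already in order.
  return result.concat(left.slice(i)).concat(right.slice(j));
}

console.log(mergeSort([38, 27, 43, 3, 9, 82, 10])); // [3, 9, 10, 27, 38, 43, 82]
```

Using `<=` when comparing the fronts keeps equal elements in their original relative order, which is what makes this merge sort stable.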

If we have the luxury of allocating all the memory necessary and have a large number of items to sort, merge sort is an excellent choice. Its implementation is easy to read (which is always good practice), and of the algorithms covered here it is the only stable one with O(n log n) time complexity. Its major trade-off is its space complexity: at O(n) extra space, it does not sort an array in place, so it must allocate memory for the subarrays it creates. There are also a few cases where the next sorting algorithm can perform better than merge sort.

Quick Sort

This last sorting algorithm, quick sort, also uses a divide-and-conquer approach, dividing a received array into subsets to sort, though not in the same way as merge sort. There are some structural differences to note. Quick sort is a comparison sorting algorithm: sorting happens by iterating through the array and comparing its values to a chosen pivot. The pivot partitions the array into left and right subarrays, and the process repeats recursively until the array is sorted.

We can actually define our pivot in one of four ways: the first element, the last element, a random element, or the median element. Each choice has ramifications for the algorithm’s time complexity, so choose your pivot using your better judgment: a poor choice (for example, always picking the first element of an already-sorted array) can make quick sort run in quadratic time, O(n²).

Before I continue, let’s take a look at a quick-sort algorithm:

A quick sort algorithm, defined by three functions. (created at https://carbon.now.sh)

Quick Sort implementation… Breathe.

The implementation of quick-sort (above) is executed with three functions, quickSort(), partition(), and swap(). I hope you have a smaller net loss of hair than I did when I learned this. I’ll break this down step-by-step to see if that helps both of us.

  1. We start our quick sort by defining a pivot and swapping it with the index on the far right. The pivot is how we are going to define our left and right partitions, but first, we must be sure that every element to the left of the pivot has a value less than the pivot, and every element to the right of the pivot has a greater value than it.
  2. With that said, we take the array and define our left and right bounds as the first element and second-to-last element, respectively (array[0] and array[array.length - 2]).
  3. Once that is set up, the left bound will iterate incrementally through the array until it finds a value greater than the pivot. If one is found, it becomes the new left bound.
  4. Then the right bound will iterate in reverse until it finds a value less than the pivot. If one is found, it becomes the new right bound.
  5. If two new bounds are found, the two values swap and each becomes the updated bound of their respective side (The right bound that gets swapped to the left is now the left bound, and vice-versa).
  6. After this happens, you repeat the process until there are no values that can be swapped. When this happens, it means all elements to the left of the left bound have a value less than the pivot, and all elements to the right of the left bound have a value greater than the pivot (though both are not sorted necessarily). When this happens, the pivot swaps positions with the left bound. We have found the sorted position for our pivot. However, there is more work to be done after positioning the pivot where it belongs.
  7. Now that one value is sorted, it marks where the left and right partitions begin (don’t include the pivot in either, remember that it’s where it belongs). We can take our left and right partitions and repeat the whole aforementioned process recursively until all elements are sorted in their place. If either half is down to a single element, then that lone element is where it belongs. This must be done until every subarray is eventually divided into subarrays with a single element in each one. When this happens, it means every single one of these lonely subarrays are now sorted in place, and the sorting is complete.
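The steps above can be sketched as follows. This is a minimal illustration assuming the pivot has already been parked at the far right (here, simply the last element), with the three functions quickSort(), partition(), and swap() mirroring the structure described; it is not necessarily identical to the code in the screenshot:

```javascript
function quickSort(array, left = 0, right = array.length - 1) {
  if (left < right) {
    const pivotIndex = partition(array, left, right);
    quickSort(array, left, pivotIndex - 1);  // sort the left partition
    quickSort(array, pivotIndex + 1, right); // sort the right partition
  }
  return array;
}

function partition(array, left, right) {
  // The pivot sits at the far right (step 1).
  const pivot = array[right];
  let i = left;      // left bound
  let j = right - 1; // right bound (second-to-last element, step 2)

  while (i <= j) {
    while (i <= j && array[i] < pivot) i++; // step 3: find a value >= pivot
    while (i <= j && array[j] > pivot) j--; // step 4: find a value <= pivot
    if (i <= j) {
      swap(array, i, j); // step 5: put the pair on their correct sides
      i++;
      j--;
    }
  }

  swap(array, i, right); // step 6: drop the pivot into its sorted position
  return i;
}

function swap(array, a, b) {
  const temp = array[a];
  array[a] = array[b];
  array[b] = temp;
}

console.log(quickSort([3, 7, 8, 5, 2, 1, 9, 5, 4])); // [1, 2, 3, 4, 5, 5, 7, 8, 9]
```

Everything happens in place, which is quick sort’s big advantage over merge sort in terms of space.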


I don’t know about you, reader, but I think I need to look into hair regrowth treatment after this. Congrats to you for getting through quick-sort in a better state than I did. Thank you for reading if you got to this point, and of course, be sure to clap, share, all that jazz. That’s all for now! Next time, I’ll stray from data structures and algorithms and move on to something more exciting… dynamic programming! Thanks again for reading, and until next time!

Sources Cited

Toptal. “Merge Sort — Sorting Algorithm Animations | Toptal®.” Toptal — Hire Freelance Talent from the Top 3%, https://www.toptal.com/developers/sorting-algorithms/merge-sort. Accessed 4 June 2021.

“Quick Sort | Zero To Mastery Academy.” ZTM Academy | Zero To Mastery Academy, https://academy.zerotomastery.io/courses/master-the-coding-interview-data-structures-algorithms/lectures/12667653. Accessed 4 June 2021.

OpenDSA. “Quicksort Visualization.” OpenDSA, https://opendsa-server.cs.vt.edu/embed/quicksortAV. Accessed 4 June 2021.

“Array.Prototype.Sort() — JavaScript | MDN.” MDN Web Docs, Mozilla, https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/sort. Accessed 7 June 2021.


Oscar Luna

Front End Developer and dev blogger from San Francisco, documenting my learning journey, one commit at a time. In search of full time work. https://oscarluna.dev