Python | Algorithms Cheat Sheet | Part 1 — Searching & Sorting

Jordan Parker
5 min read · Jun 25, 2020

--

This cheat sheet summarises key algorithms often tested in coding interviews and provides code examples for a Python-based coding exam. Hopefully, this article will be a useful tool in your preparation for that upcoming job interview or university exam. Please note that Python is a sub-optimal language for competitive programming.

If your coding exam requires you to write in Python, this cheat sheet will help!

This cheat sheet is broken down into the following sections:

  1. Searching & Sorting Algorithms
  2. Data Structures
  3. Graph Algorithms
  4. Dynamic Programming
  5. Number Theory

Part 1 will address Searching & Sorting Algorithms.

1) Searching & Sorting Algorithms

The searching and sorting algorithms below are discussed with reference to the array data structure. However, some sorting operations are better suited to linked lists; these scenarios have been highlighted.

Searching algorithms return the index of a matched value, whereas sorting algorithms arrange the elements of an array in their ascending or descending order. A sorted array is required for some searching algorithms.

A) Simple Sorting Algorithms

Selection Sort

A simple iteration over an array that finds the index of the minimum (or maximum) value among the unvisited elements. The value at that index is then swapped with the value at the current index.


Average Time Complexity: O(n²)
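A minimal Python sketch of the approach (the function name is illustrative):

```python
def selection_sort(arr):
    """Sort arr in place, ascending, by repeatedly selecting the minimum."""
    n = len(arr)
    for i in range(n):
        # Find the index of the smallest unvisited element.
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Swap it into the current position.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
```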

Insertion Sort

Partitions the array into sorted and unsorted sub-arrays. It selects the first element of the unsorted sub-array and shifts larger sorted elements one position to the right until the correct insertion index is found. The insertion index is found by scanning the sorted sub-array from right to left.

This approach is an efficient solution for partially sorted arrays and small datasets. In practice, it is a more efficient sorting algorithm than other O(n²) algorithms (e.g. Selection Sort and Bubble Sort). Think of Insertion Sort as how people naturally sort a hand of cards.

Average Time Complexity: O(n²)
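The shifting process above could be sketched as follows (the function name is illustrative):

```python
def insertion_sort(arr):
    """Sort arr in place, ascending, by growing a sorted prefix."""
    for i in range(1, len(arr)):
        key = arr[i]  # first element of the unsorted sub-array
        j = i - 1
        # Scan the sorted prefix right to left, shifting larger values right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # insert at the correct index
    return arr
```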

B) Complex Sorting Algorithms

Merge Sort

A Divide & Conquer algorithm that utilises a result array and completes the following two-step process:

  1. A merge() function sorts two arrays by comparing their current front values and appending the smaller (or larger, for descending order) to the result array.
  2. A merge_sort() function recursively divides an array into sub-arrays until the arrays are of length 1. Once divided, it traverses back through the recursion, combining the split arrays with the merge() function.

Merge Sort’s result array increases the space complexity to O(n). The Divide & Conquer nature of Merge Sort makes its operations highly parallelisable, and therefore effective for distributed computing and large datasets.

Merge Sort is an optimal solution for sorting linked lists, as linked lists allow nodes to be re-linked in place with O(1) insertion. This removes the result-array overhead.

Average Time Complexity: O(n log n)
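The two-step process above could be sketched as follows, with function names matching the merge()/merge_sort() described in the text:

```python
def merge(left, right):
    """Combine two sorted arrays into one sorted result array."""
    result = []
    i = j = 0
    # Compare the front value of each array; the smaller is appended.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One array is exhausted; append the remainder of the other.
    result.extend(left[i:])
    result.extend(right[j:])
    return result

def merge_sort(arr):
    """Recursively divide arr until length 1, then merge back up."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    return merge(merge_sort(arr[:mid]), merge_sort(arr[mid:]))
```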

Quick Sort

A Divide & Conquer algorithm that makes in-place swaps to build lower and higher sub-arrays around a pivot index. It completes the following process:

  1. It selects an index to represent the pivot (the right-most index in the implementation here).
  2. It compares each element to the pivot’s value and swaps elements lower than the pivot to the end of the growing lower sub-array.
  3. After looping through the array, the pivot is swapped to the index just after the end of the lower sub-array. The array is now partitioned by the pivot into lower and higher sub-arrays.
  4. This process is repeated for both the lower and higher sub-arrays.

The choice of pivot is crucial to the efficiency of Quick Sort: the closer the pivot is to the median, the better. The worst case occurs when the pivot is always the smallest or largest value, giving time complexity O(n²). The method implemented here assigns the last array item as the pivot, but many other methods exist to optimise pivot selection (e.g. median-of-three or random selection).

Quick Sort is preferred over Merge Sort for arrays as it is an in-place solution. It does not require the result array and its time and memory overhead. However, as Quick Sort often requires random access of array elements, it is a sub-optimal solution for linked lists (random access in a linked list is not possible and traversal across the list is required instead).

Average Time Complexity: O(n log n)
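The four steps above could be sketched as follows, using the last-element pivot described in the text (function names are illustrative):

```python
def partition(arr, low, high):
    """Partition arr[low:high+1] around the pivot arr[high]; return pivot's final index."""
    pivot = arr[high]  # right-most element is the pivot
    i = low - 1        # end of the growing lower sub-array
    for j in range(low, high):
        if arr[j] < pivot:
            # Swap lower elements to the end of the lower sub-array.
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Swap the pivot to the index just after the lower sub-array.
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quick_sort(arr, low=0, high=None):
    """Sort arr in place, ascending, by recursing on each partition."""
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quick_sort(arr, low, p - 1)   # lower sub-array
        quick_sort(arr, p + 1, high)  # higher sub-array
    return arr
```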

Summary of Sorting Algorithm Speeds

  Selection Sort: O(n²) average
  Insertion Sort: O(n²) average (efficient on partially sorted arrays)
  Merge Sort:     O(n log n) average, O(n) extra space
  Quick Sort:     O(n log n) average, O(n²) worst case

C) Search Algorithms

Linear Search

A simple iteration through a sorted or unsorted array returning the index of the matched value. If the array is sorted, Binary Search presents a much faster alternative to Linear Search.

Average Time Complexity: O(n)
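A minimal Python sketch, returning -1 when no match is found (an assumed convention, since the text does not specify one):

```python
def linear_search(arr, target):
    """Return the index of the first match of target in arr, or -1."""
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1  # no match found
```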

Binary Search

Splits a sorted array down the middle and checks the following cases:

  1. Does the middle index match? If so return the index.
  2. Is the searched value lower or higher than the value at the middle index? Repeat Binary Search on the lower or higher sub-array respectively.

Binary Search is how people naturally search an encyclopedia. It is significantly faster than Linear Search on sorted data. The recursion effectively traverses an implicit binary tree of middle indexes.

Average Time Complexity: O(log n)
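The two recursive cases above could be sketched as follows (the function name and the -1 not-found convention are illustrative):

```python
def binary_search(arr, target, low=0, high=None):
    """Return the index of target in the sorted array arr, or -1."""
    if high is None:
        high = len(arr) - 1
    if low > high:
        return -1  # sub-array is empty; no match
    mid = (low + high) // 2
    if arr[mid] == target:
        return mid  # case 1: the middle index matches
    # Case 2: recurse on the lower or higher sub-array.
    if target < arr[mid]:
        return binary_search(arr, target, low, mid - 1)
    return binary_search(arr, target, mid + 1, high)
```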

Exponential Search

An optimisation of Binary Search that finds an upper bound i by repeatedly doubling an index until the value at i exceeds the searched value, which minimises the size of the range a Binary Search must split. Binary Search is then performed on the sub-array bounded by [i/2 : i].

Exponential Search outperforms Binary Search when the matched value is closer to the beginning of the array. As its worst-case time complexity remains O(log n), Exponential Search is often used in practice.

Average Time Complexity: O(log i)
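The doubling-then-searching process could be sketched as follows, with the Binary Search inlined so the block is self-contained (names and the -1 convention are illustrative):

```python
def exponential_search(arr, target):
    """Return the index of target in the sorted array arr, or -1."""
    if not arr:
        return -1
    if arr[0] == target:
        return 0
    # Double i until arr[i] exceeds the target or i runs past the array.
    i = 1
    while i < len(arr) and arr[i] <= target:
        i *= 2
    # Binary Search within the bracketed sub-array [i // 2 : i].
    low, high = i // 2, min(i, len(arr) - 1)
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1  # no match found
```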

Search and sort away…

Stay tuned for Part 2 — Data Structures

--

Written by Jordan Parker

A Data Scientist and Technology Consultant who enjoys building Neural Networks.