The Top Essential Algorithms for Developers, including video lessons and code examples

UI Engineer
14 min read · Apr 24, 2023

Whether you’re a beginner just starting out in the world of tech or an experienced developer looking to expand your skills, understanding key algorithms is essential for success. In this post, we’ll explore some of the most important algorithms every developer should know: Binary Search, Bubble Sort, Merge Sort, Quick Sort, Depth-First Search, Breadth-First Search, Dijkstra’s Algorithm, the Knapsack Problem, and the Traveling Salesman Problem. I’ll provide code examples for each and point you to video lessons so you can understand these algorithms in depth and apply them to your own projects. So, let’s get started!

👨‍💻 Binary Search

Binary search is an efficient algorithm for finding a particular item in a sorted list. It repeatedly divides the search interval in half until the target item is found or the interval is empty. Here’s an example implementation in Python:

def binary_search(arr, target):
    left = 0
    right = len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1

Here’s an example in JavaScript:

function binarySearch(arr, target) {
  let left = 0;
  let right = arr.length - 1;
  while (left <= right) {
    const mid = Math.floor((left + right) / 2);
    if (arr[mid] === target) {
      return mid;
    } else if (arr[mid] < target) {
      left = mid + 1;
    } else {
      right = mid - 1;
    }
  }
  return -1;
}

Both versions return the index of the target if it is found, or -1 if it is not present. Because the search interval is halved on every iteration, binary search runs in O(log n) time, but keep in mind that it only works on a sorted array.
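As a quick sanity check, here is how the Python binary_search above might be used (the sample values are arbitrary):

# Illustrative usage of binary_search (sample data chosen for the example)
numbers = [1, 3, 5, 7, 9, 11]

print(binary_search(numbers, 7))   # 3 (index of 7)
print(binary_search(numbers, 4))   # -1 (not present)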

👩‍💻 Bubble Sort

Bubble sort is a basic sorting algorithm that repeatedly swaps adjacent elements if they are in the wrong order. Here’s an example in Python:

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return arr

To make the bubble sort algorithm more efficient and reduce its time complexity, two optimizations can be made. First, a flag variable called “swapped” is added to keep track of whether any swaps were made during an iteration of the outer loop. If no swaps were made, then the flag remains False, and the loop is terminated early. Second, the size of the inner loop is reduced by i iterations in each iteration of the outer loop.
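For example, here is a short, hypothetical run of the Python bubble_sort above; the early exit kicks in as soon as a pass makes no swaps:

# Illustrative usage of bubble_sort (sample data chosen for the example)
print(bubble_sort([5, 1, 4, 2, 8]))   # [1, 2, 4, 5, 8]
print(bubble_sort([1, 2, 3]))         # already sorted: the swapped flag ends the loop after one pass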

JavaScript:

function bubbleSort(arr) {
  const n = arr.length;
  for (let i = 0; i < n; i++) {
    let swapped = false;
    for (let j = 0; j < n - i - 1; j++) {
      if (arr[j] > arr[j + 1]) {
        [arr[j], arr[j + 1]] = [arr[j + 1], arr[j]];
        swapped = true;
      }
    }
    if (!swapped) {
      break;
    }
  }
  return arr;
}

👨‍💻 Merge Sort

Merge sort is an efficient sorting algorithm. It divides the unsorted list into n sub-lists, each containing one element. Then, it repeatedly merges the sub-lists to produce new sorted sub-lists until only one sub-list remains. Here is an example implementation in Python:

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = arr[:mid]
    right = arr[mid:]
    left = merge_sort(left)
    right = merge_sort(right)
    return merge(left, right)

def merge(left, right):
    result = []
    i = 0
    j = 0
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result += left[i:]
    result += right[j:]
    return result

This implementation uses recursion to divide the input array into smaller subarrays, each containing only one element. It then merges the subarrays back together by comparing elements and inserting them into a new array in sorted order.

The merge_sort function is the main entry point for the algorithm, while the merge function is a helper function used to merge the subarrays.
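A brief usage sketch of the Python version (the sample values are arbitrary):

# Illustrative usage of merge_sort (sample data chosen for the example)
print(merge_sort([38, 27, 43, 3, 9, 82, 10]))   # [3, 9, 10, 27, 38, 43, 82]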

JavaScript:

function mergeSort(arr) {
  if (arr.length <= 1) {
    return arr;
  }
  const mid = Math.floor(arr.length / 2);
  const left = arr.slice(0, mid);
  const right = arr.slice(mid);
  const sortedLeft = mergeSort(left);
  const sortedRight = mergeSort(right);
  return merge(sortedLeft, sortedRight);
}

function merge(left, right) {
  const result = new Array(left.length + right.length);
  let i = 0;
  let j = 0;
  let k = 0;
  while (i < left.length && j < right.length) {
    if (left[i] < right[j]) {
      result[k++] = left[i++];
    } else {
      result[k++] = right[j++];
    }
  }
  while (i < left.length) {
    result[k++] = left[i++];
  }
  while (j < right.length) {
    result[k++] = right[j++];
  }
  return result;
}

In this implementation, we create a new array result with the size of left plus right, and we use a third index k to keep track of the current position in the result array. We then iterate over the left and right arrays with i and j, and compare their elements. We assign the smaller element to result[k], and increment k and the index of the array from which the element was taken. Finally, we copy any remaining elements from left or right into result.

👩‍💻 Quick Sort

Quick sort is an efficient sorting algorithm that works by partitioning the array around a pivot element and then recursively sorting the sub-arrays.

Python:

The function first checks if the length of the array is less than or equal to 1. If so, it returns the array as is, since there is no need to sort a single element or an empty array.

If the array has more than one element, the pivot is chosen as the first element. Then, two sub-arrays are created: one containing all elements less than the pivot, and the other containing all elements greater than or equal to the pivot. This is done using list comprehension.

The function then recursively calls itself on the two sub-arrays and concatenates the results with the pivot element in the middle to obtain the final sorted array.

def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[0]
    left = [x for x in arr[1:] if x < pivot]
    right = [x for x in arr[1:] if x >= pivot]
    return quick_sort(left) + [pivot] + quick_sort(right)

JavaScript:

function quickSort(arr) {
  if (arr.length <= 1) {
    return arr;
  }
  const pivot = arr[0];
  const left = arr.slice(1).filter(x => x < pivot);
  const right = arr.slice(1).filter(x => x >= pivot);
  return [...quickSort(left), pivot, ...quickSort(right)];
}

The function quickSort takes an array arr as input and returns a new array sorted in ascending order using the Quick Sort algorithm; like the Python version, it builds new arrays rather than sorting in place.
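As a quick check, here is how the Python quick_sort above behaves on a small sample list (values chosen arbitrarily):

# Illustrative usage of quick_sort (sample data chosen for the example)
print(quick_sort([10, 7, 8, 9, 1, 5]))   # [1, 5, 7, 8, 9, 10]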

👨‍💻 Depth-First Search

Depth-first search is an algorithm used for traversing or searching tree or graph data structures. It begins at the root (or another arbitrarily selected node) and explores as far as possible along each branch before backtracking.

def dfs(graph, node, visited=None):
    if visited is None:
        visited = set()
    visited.add(node)
    for neighbor in graph[node]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)
    return visited

The function takes in a graph represented as a dictionary of adjacency lists, a starting node, and an optional set of visited nodes. It performs a recursive depth-first search starting from the given node, marking visited nodes along the way, and returns the set of visited nodes.
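For instance, with a small, made-up adjacency-list graph:

# A hypothetical graph represented as a dictionary of adjacency lists
graph = {
    'A': ['B', 'C'],
    'B': ['D'],
    'C': ['E'],
    'D': [],
    'E': ['A'],
}

print(dfs(graph, 'A'))   # all nodes reachable from 'A', e.g. {'A', 'B', 'C', 'D', 'E'}

Here is the same search in JavaScript: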

function dfs(graph, node, visited = new Set()) {
  visited.add(node);
  for (const neighbor of graph[node]) {
    if (!visited.has(neighbor)) {
      dfs(graph, neighbor, visited);
    }
  }
  return visited;
}

The function takes in a graph object, a node to start the search from, and a visited set that keeps track of the nodes that have already been visited.

The algorithm starts by adding the node to the visited set. Then, for each neighbor of the node in the graph, it recursively calls the dfs function with that neighbor as the new node. The base case for the recursion is when all neighbors of the current node have already been visited.

The function returns the visited set at the end, which contains all the nodes that were visited during the search.

👩‍💻 Breadth-First Search

Breadth-first search is an algorithm for traversing or searching tree or graph data structures. It begins by starting at the root (or another arbitrarily selected node) and exploring all the neighboring nodes at the current depth before moving on to the nodes at the next depth.

Python code:

from collections import deque

def bfs(graph, start):
    visited = set()
    queue = deque([start])
    while queue:
        node = queue.popleft()  # deque gives O(1) removal from the front
        visited.add(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                queue.append(neighbor)
    return visited

The function bfs takes in a graph and a starting node as arguments. It initializes an empty set called visited and a queue with the starting node. While the queue is not empty, the first node in the queue is dequeued and added to the visited set. Then, for each neighbor of the dequeued node in the graph, if the neighbor has not been visited before, it is added to the end of the queue. This process continues until the queue is empty, and the function returns the set of visited nodes.
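Using the same style of hypothetical adjacency-list graph as in the DFS example:

# Illustrative usage of bfs on a small, made-up graph
graph = {
    'A': ['B', 'C'],
    'B': ['D'],
    'C': ['E'],
    'D': [],
    'E': [],
}

print(bfs(graph, 'A'))   # all nodes reachable from 'A', e.g. {'A', 'B', 'C', 'D', 'E'}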

The same code translated to JavaScript:

function bfs(graph, start) {
  const visited = new Set();
  const queue = [start];
  while (queue.length > 0) {
    const node = queue.shift();
    visited.add(node);
    for (const neighbor of graph[node]) {
      if (!visited.has(neighbor)) {
        queue.push(neighbor);
      }
    }
  }
  return visited;
}

👨‍💻 Dijkstra’s Algorithm

Dijkstra’s algorithm is a well-known graph algorithm that is used to determine the shortest path between two nodes. It is popularly used in many applications such as GPS navigation systems, network routing protocols, and transportation planning. The algorithm works by maintaining a set of visited nodes and a priority queue of tentative distances to each node.

In detail, the algorithm begins by assigning a tentative distance to each node in the graph. It then marks the starting node as visited and proceeds to visit the node with the smallest tentative distance. From this node, the algorithm examines all of the neighboring nodes and calculates their tentative distances. If the newly calculated tentative distance is less than the current tentative distance for a neighboring node, the current tentative distance is updated. The algorithm then marks the node with the smallest tentative distance as visited and repeats the process until every reachable node has been visited (or until the destination node is reached, if only a single path is needed).

The algorithm can be further optimized by using a data structure such as a binary heap or Fibonacci heap to store the priority queue of tentative distances. This allows for faster retrieval of the node with the smallest tentative distance and reduces the time complexity of the algorithm.

Python:

import heapq

def dijkstra(graph, start):
    distances = {node: float('inf') for node in graph}
    distances[start] = 0
    queue = [(0, start)]
    while queue:
        current_distance, current_node = heapq.heappop(queue)
        if current_distance > distances[current_node]:
            continue
        for neighbor, distance in graph[current_node].items():
            tentative_distance = current_distance + distance
            if tentative_distance < distances[neighbor]:
                distances[neighbor] = tentative_distance
                heapq.heappush(queue, (tentative_distance, neighbor))
    return distances

The graph parameter is a dictionary where each key is a node in the graph and each value is another dictionary representing its neighbors and their distances. For example, graph = {'A': {'B': 2, 'C': 5}, 'B': {'C': 1, 'D': 3}, 'C': {'D': 1}, 'D': {'A': 1}} represents a graph with nodes A, B, C, and D and edges A-B with weight 2, A-C with weight 5, B-C with weight 1, B-D with weight 3, C-D with weight 1, and D-A with weight 1.

The algorithm initializes all distances to infinity except for the start node, which is set to 0. It then adds the start node to the priority queue with a distance of 0. The loop runs until the queue is empty, and in each iteration, it pops the node with the smallest distance from the queue. If the distance to the popped node is greater than the distance stored in the distances dictionary, the algorithm skips processing that node, as it has already been processed. Otherwise, it updates the distances of all neighbors of the popped node and adds them to the priority queue.
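Running the Python function on the example graph described above gives the shortest distances from node 'A':

# The example graph from above, with distances computed from node 'A'
graph = {'A': {'B': 2, 'C': 5}, 'B': {'C': 1, 'D': 3}, 'C': {'D': 1}, 'D': {'A': 1}}

print(dijkstra(graph, 'A'))   # {'A': 0, 'B': 2, 'C': 3, 'D': 4}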

JavaScript:

function dijkstra(graph, start) {
  const distances = {};
  for (const node in graph) {
    distances[node] = Infinity;
  }
  distances[start] = 0;
  const queue = [[0, start]];
  while (queue.length > 0) {
    // Sort by distance so the closest node is dequeued first
    // (a simple stand-in for a real priority queue such as a binary heap).
    queue.sort((a, b) => a[0] - b[0]);
    const [currentDistance, currentNode] = queue.shift();
    if (currentDistance > distances[currentNode]) {
      continue;
    }
    for (const [neighbor, distance] of Object.entries(graph[currentNode])) {
      const tentativeDistance = currentDistance + distance;
      if (tentativeDistance < distances[neighbor]) {
        distances[neighbor] = tentativeDistance;
        queue.push([tentativeDistance, neighbor]);
      }
    }
  }
  return distances;
}
  • The function dijkstra takes in a graph object and a start node as input.
  • First, the function initializes the distances object with all nodes in the graph having a distance of infinity except for the start node which has a distance of 0.
  • The queue array is also initialized with an array containing the start node and its distance from itself, i.e., [0, start].
  • The main loop of the algorithm runs as long as there are nodes in the queue.
  • At each iteration of the loop, the queue is sorted by distance and the node with the shortest distance from the start node is dequeued; sorting the array is a simple stand-in for a proper priority queue.
  • For each neighbor of the dequeued node, the algorithm computes the tentative distance to that neighbor and updates the distances object and queue accordingly.
  • The algorithm continues until all nodes have been visited and the distances object contains the shortest distances from the start node to all other nodes in the graph.
  • Finally, the function returns the distances object.

👩‍💻 Knapsack Problem

The knapsack problem is a classic optimization problem that involves packing a set of items into a knapsack of limited capacity to maximize the total value of the items. Here is an example implementation of the dynamic programming solution in Python:

def knapsack(values, weights, capacity):
    n = len(values)
    dp = [[0 for _ in range(capacity + 1)] for _ in range(n + 1)]

    for i in range(1, n + 1):
        for j in range(1, capacity + 1):
            if weights[i - 1] <= j:
                dp[i][j] = max(dp[i - 1][j], values[i - 1] + dp[i - 1][j - weights[i - 1]])
            else:
                dp[i][j] = dp[i - 1][j]

    return dp[n][capacity]

This function takes in three parameters: values, weights, and capacity. The values and weights arrays contain the values and weights of n items, respectively. The capacity parameter represents the maximum capacity of the knapsack.

The function implements dynamic programming to find the maximum value that can be obtained by filling the knapsack with items of different values and weights, without exceeding the capacity of the knapsack. The time complexity of this algorithm is O(nW), where n is the number of items and W is the maximum capacity of the knapsack.
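For example (sample values chosen for illustration; note that this Python version takes values first, then weights):

# Illustrative usage of the Python knapsack(values, weights, capacity)
values = [3, 4, 5, 6]
weights = [2, 3, 4, 5]
capacity = 8

print(knapsack(values, weights, capacity))   # 10 (take the items with weights 3 and 5)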

JavaScript:

function knapsack(weights, values, capacity) {
  const n = weights.length;
  const dp = new Array(n + 1).fill(null).map(() => new Array(capacity + 1).fill(0));

  for (let i = 1; i <= n; i++) {
    for (let j = 1; j <= capacity; j++) {
      if (weights[i - 1] > j) {
        dp[i][j] = dp[i - 1][j];
      } else {
        dp[i][j] = Math.max(dp[i - 1][j], values[i - 1] + dp[i - 1][j - weights[i - 1]]);
      }
    }
  }

  return dp[n][capacity];
}

// Example usage:
const weights = [2, 3, 4, 5];
const values = [3, 4, 5, 6];
const capacity = 8;
const result = knapsack(weights, values, capacity);
console.log(result); // Output: 10 (take the items with weights 3 and 5)

In this example, the knapsack function takes in an array of weights, an array of values, and a capacity value. It uses dynamic programming to determine the maximum value that can be obtained with the given capacity. The function returns this maximum value.

The function creates a 2D array dp with n+1 rows and capacity+1 columns, where n is the length of the weights and values arrays. Each cell of dp represents the maximum value that can be obtained with the given capacity and items up to that index. The base case is when there are no items or the capacity is 0, in which case the maximum value is 0.

The function then iterates through the items and the capacities, filling in the dp array according to the rules of the Knapsack problem. If the weight of the current item is greater than the current capacity, the maximum value stays the same as in the row above (i.e., without the current item). Otherwise, the maximum value is the larger of two options: the value without the current item, or the value of the current item plus the maximum value that can be obtained with the remaining capacity and the items up to the previous index.

Finally, the function returns the maximum value that can be obtained with the given capacity and all the items. In the example usage, we provide some sample values for the weights, values, and capacity, and print the result to the console.

👨‍💻 Traveling Salesman Problem

The traveling salesman problem is a classic optimization problem that involves finding the shortest possible route that visits every node in a graph exactly once and returns to the starting node. It is a well-known NP-hard problem, which means that there is no known polynomial-time algorithm for solving it in the worst case. However, there are several approximate algorithms that can find good solutions in practice.

Python code:

import itertools

def tsp(graph):
    n = len(graph)
    nodes = list(range(n))
    best_path = None
    best_cost = float('inf')
    for path in itertools.permutations(nodes):
        cost = sum(graph[path[i]][path[(i + 1) % n]] for i in range(n))
        if cost < best_cost:
            best_path = path
            best_cost = cost
    return best_path, best_cost

The algorithm generates all possible permutations of the cities and calculates the total distance traveled in each permutation. It keeps track of the best path found so far and returns it at the end. The implementation uses the itertools library to generate permutations and assumes that the input graph is represented as a matrix of distances between each pair of cities.
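A tiny, hypothetical 4-city distance matrix shows the brute-force search in action (only practical for very small n, since it checks all n! permutations):

# A small, made-up symmetric distance matrix for 4 cities
graph = [
    [0, 10, 15, 20],
    [10, 0, 35, 25],
    [15, 35, 0, 30],
    [20, 25, 30, 0],
]

path, cost = tsp(graph)
print(path, cost)   # a tour such as (0, 1, 3, 2) with total cost 80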

JavaScript (this version uses a simpler nearest-neighbor heuristic instead of brute force):

function tsp(graph) {
  const n = graph.length;
  const nodes = [...Array(n).keys()];
  const visited = new Set([0]);
  let current = 0;
  let path = [0];
  let cost = 0;
  while (visited.size < n) {
    let minDist = Infinity;
    let next;
    for (const neighbor of nodes) {
      if (!visited.has(neighbor) && graph[current][neighbor] < minDist) {
        minDist = graph[current][neighbor];
        next = neighbor;
      }
    }
    visited.add(next);
    path.push(next);
    cost += minDist;
    current = next;
  }
  path.push(0);
  cost += graph[current][0];
  return [path, cost];
}

The function takes in a graph represented as a 2D array where graph[i][j] is the distance between node i and node j. The function returns an array with two elements: the first element is an array representing the path found by the nearest-neighbor heuristic, and the second element is the total cost of that path.

The algorithm starts at node 0 and repeatedly visits the unvisited node with the shortest distance to the current node. It keeps track of the visited nodes, the current node, the path, and the cost. Once all nodes have been visited, the algorithm returns the path and cost, including the return trip to node 0.

Note that this implementation does not necessarily find the optimal solution, as the Travelling Salesman Problem is NP-hard and no efficient algorithm is known to solve it for large inputs.

Learning algorithms is an important skill for any Software Developer. Fortunately, there are many excellent online resources to help you learn and practice algorithms. Below are some helpful YouTube channels:

👨‍💻 CS50

This channel has a large collection of algorithm and data structure tutorials, including topics such as dynamic programming, graph algorithms, and binary search.

👩‍💻 MIT OpenCourseWare

MIT has made many of its computer science courses available online, including courses on algorithms and data structures.

🔥 freeCodeCamp

freeCodeCamp is a non-profit organization that provides free online courses in web development and other programming topics. Their mission is to help people learn to code and prepare for a career in tech, while also contributing to open source projects that benefit the community.

🙌 Conclusion

In this post, we have covered the top algorithms that every developer should know. These algorithms cover a wide range of topics, from sorting and searching to graph traversal and optimization. By mastering them, you will have a solid foundation for solving a wide variety of problems in your day-to-day development work.

With code examples provided in both Python and JavaScript, you should be able to implement these algorithms in your preferred programming language.

Happy coding!
