Every Sorting Algorithm You Must Know : Interview Preparation

Saquib Aftab · Published in Javarevisited · 6 min read · Feb 25, 2024

There are a few sorting algorithms you need to know if you are preparing for a Software Engineering interview.

These algorithms are rarely asked about directly, but they come up indirectly in a wide variety of interview questions.

LET’S START

1. Bubble Sort

Bubble sort is a stable, in-place sorting algorithm. Its average and worst case time complexity is O(N²), and with the early-exit check shown below, the best case (an already sorted array) is O(N).

Time Complexity: O(N²)
Auxiliary Space: O(1)

Advantages of Bubble Sort

  • Bubble sort is easy to understand and implement.
  • It does not require any additional memory space.
  • It is a stable sorting algorithm, meaning that elements with the same key value maintain their relative order in the sorted output.

Disadvantages of Bubble Sort

  • Bubble sort has a time complexity of O(N²), which makes it very slow for large data sets.
  • Bubble sort is a comparison-based sorting algorithm, which means it needs a comparison operator to determine the relative order of elements in the input data set. This can limit its efficiency in certain cases.

Algorithm

static void bubbleSort(int arr[], int n)
{
    int i, j, temp;
    boolean swapped;
    for (i = 0; i < n - 1; i++) {
        swapped = false;
        for (j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {

                // Swap arr[j] and arr[j+1]
                temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
                swapped = true;
            }
        }

        // If no two elements were swapped by the
        // inner loop, the array is already sorted
        if (swapped == false)
            break;
    }
}
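
A minimal driver, assuming it sits in the same class as the bubbleSort method above (the sample array is made up for illustration), shows how to call it. On an already sorted input the swapped flag ends the outer loop after one pass, which is where the O(N) best case comes from.

public static void main(String[] args)
{
    int[] arr = { 64, 34, 25, 12, 22, 11, 90 }; // sample input, chosen arbitrarily
    bubbleSort(arr, arr.length);
    System.out.println(java.util.Arrays.toString(arr));
    // Prints: [11, 12, 22, 25, 34, 64, 90]
}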

2. Selection Sort

Selection sort is an in-place sorting algorithm, as it does not use extra space. However, it is not a stable sorting algorithm. Its time complexity is O(N²) in all cases.

Advantages

  • Simple and easy to understand.
  • Works well with small datasets.

Disadvantages

  • Selection sort has a time complexity of O(N²) in the best, average, and worst case.
  • Does not work well on large datasets.
  • Does not preserve the relative order of items with equal keys, which means it is not stable (see the small demonstration after the code below).

Algorithm

void selectionSort(int arr[])
{
    int n = arr.length;

    // One by one move boundary of unsorted subarray
    for (int i = 0; i < n - 1; i++)
    {
        // Find the minimum element in the unsorted part
        int min_idx = i;
        for (int j = i + 1; j < n; j++)
            if (arr[j] < arr[min_idx])
                min_idx = j;

        // Swap the found minimum element with the first
        // element of the unsorted part
        int temp = arr[min_idx];
        arr[min_idx] = arr[i];
        arr[i] = temp;
    }
}
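
To see the instability mentioned above concretely, here is a minimal sketch (the keys and labels are made up for illustration): two records share the key 2, and the swap that jumps the minimum to the front flips their relative order.

public static void main(String[] args)
{
    int[] keys      = { 2, 2, 1 };
    String[] labels = { "a", "b", "c" }; // labels mark each record's original position

    // Same selection-sort logic as above, swapping keys and labels together
    for (int i = 0; i < keys.length - 1; i++) {
        int min_idx = i;
        for (int j = i + 1; j < keys.length; j++)
            if (keys[j] < keys[min_idx])
                min_idx = j;

        int tempKey = keys[min_idx];
        keys[min_idx] = keys[i];
        keys[i] = tempKey;

        String tempLabel = labels[min_idx];
        labels[min_idx] = labels[i];
        labels[i] = tempLabel;
    }

    System.out.println(java.util.Arrays.toString(keys));   // [1, 2, 2]
    System.out.println(java.util.Arrays.toString(labels)); // [c, b, a] -- the two records with key 2 changed order
}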

3. Insertion Sort

Insertion sort is a stable, in-place sorting algorithm. Its average and worst case time complexity is O(N²), and the best case (an already sorted array) is O(N).

Time Complexity: O(N²)
Auxiliary Space: O(1)

Algorithm

void insertionSort(int arr[])
{
    int n = arr.length;
    for (int i = 1; i < n; ++i) {
        int key = arr[i];
        int j = i - 1;

        /* Move elements of arr[0..i-1] that are
           greater than key one position ahead
           of their current position */
        while (j >= 0 && arr[j] > key) {
            arr[j + 1] = arr[j];
            j = j - 1;
        }
        arr[j + 1] = key;
    }
}
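
As with bubble sort, a small driver (the class name SortExamples and the sample data are made up for illustration) shows the best case in action: on a sorted or nearly sorted array the inner while loop barely runs, so the method finishes in roughly O(N) time.

public static void main(String[] args)
{
    int[] nearlySorted = { 1, 2, 4, 3, 5, 6 }; // only one pair out of place
    new SortExamples().insertionSort(nearlySorted);
    System.out.println(java.util.Arrays.toString(nearlySorted));
    // Prints: [1, 2, 3, 4, 5, 6]
}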

4. Merge Sort

Merge sort divides the array into two halves, sorts each half recursively, and then merges the sorted halves back together. The array is halved about log N times, and merging all the pieces at each level takes O(N) work, which is where the O(N log N) bound comes from.

Time Complexity: O(N log N)
Auxiliary Space: O(N)

Applications

  • Sorting large datasets: Merge sort is particularly well-suited for sorting large datasets due to its guaranteed worst-case time complexity of O(n log n).
  • External sorting: Merge sort is commonly used in external sorting, where the data to be sorted is too large to fit into memory.
  • Custom sorting: Merge sort can be adapted to handle different input distributions, such as partially sorted, nearly sorted, or completely unsorted data.

Advantages

  • Stability: Merge sort is a stable sorting algorithm, which means it maintains the relative order of equal elements in the input array.
  • Guaranteed worst-case performance: Merge sort has a worst-case time complexity of O(N log N), which means it performs well even on large datasets.
  • Parallelizable: Merge sort is a naturally parallelizable algorithm, which means it can be easily parallelized to take advantage of multiple processors or threads.

Disadvantages

  • Space complexity: Merge sort requires additional memory to store the merged sub-arrays during the sorting process.
  • Not in-place: Merge sort is not an in-place sorting algorithm, which means it requires additional memory to store the sorted data. This can be a disadvantage in applications where memory usage is a concern.
  • Not always optimal for small datasets: For very small datasets, Merge sort carries more overhead than simpler algorithms such as insertion sort, which can make it slower in practice.

Algorithm

void merge_sort(int A[], int start, int end)
{
    if (start < end) {
        int mid = (start + end) / 2;   // splits the current range into 2 parts
        merge_sort(A, start, mid);     // sort the 1st part of the array
        merge_sort(A, mid + 1, end);   // sort the 2nd part of the array

        // merge both parts by comparing their elements
        merge(A, start, mid, end);
    }
}

void merge(int A[], int start, int mid, int end)
{
    // p and q track the current position in each part
    int p = start, q = mid + 1;

    int[] Arr = new int[end - start + 1];
    int k = 0;

    for (int i = start; i <= end; i++) {
        if (p > mid)                   // first part is exhausted
            Arr[k++] = A[q++];
        else if (q > end)              // second part is exhausted
            Arr[k++] = A[p++];
        else if (A[p] <= A[q])         // take the smaller element; <= keeps the sort stable
            Arr[k++] = A[p++];
        else
            Arr[k++] = A[q++];
    }

    // Copy the merged result back into the original array,
    // so A[start..end] is now sorted
    for (int i = 0; i < k; i++) {
        A[start++] = Arr[i];
    }
}
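
A short driver, assuming the two methods above live in a class named SortExamples (the class name and sample input are made up for illustration):

public static void main(String[] args)
{
    int[] arr = { 38, 27, 43, 3, 9, 82, 10 };
    new SortExamples().merge_sort(arr, 0, arr.length - 1);
    System.out.println(java.util.Arrays.toString(arr));
    // Prints: [3, 9, 10, 27, 38, 43, 82]
}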

5. Quick Sort

Quick sort is not stable, but it is an in-place sorting algorithm.
Best and Average Case Complexity: O(N log N)
Worst Case Complexity: O(N²)

Advantages

  • It is a divide-and-conquer algorithm that makes it easier to solve problems.
  • It is efficient on large data sets.
  • It has a low overhead, as it only requires a small amount of memory to function.

Disadvantages

  • It has a worst-case time complexity of O(N²), which occurs when the pivot is chosen poorly (a randomized pivot, sketched after the code below, makes this unlikely).
  • It is not a good choice for small data sets.
  • It is not a stable sort: if two elements have the same key, their relative order may not be preserved in the sorted output, because elements are swapped relative to the pivot's position without considering their original positions.

Algorithm

// Utility method used by partition(); swaps arr[i] and arr[j]
static void swap(int[] arr, int i, int j)
{
    int temp = arr[i];
    arr[i] = arr[j];
    arr[j] = temp;
}

static int partition(int[] arr, int low, int high)
{
    // Choosing the last element as the pivot
    int pivot = arr[high];

    // Index of smaller element; indicates
    // the right position of the pivot found so far
    int i = (low - 1);

    for (int j = low; j <= high - 1; j++) {

        // If the current element is smaller than the pivot
        if (arr[j] < pivot) {

            // Increment index of smaller element
            i++;
            swap(arr, i, j);
        }
    }
    swap(arr, i + 1, high);
    return (i + 1);
}

// The main function that implements QuickSort
// arr[] --> Array to be sorted,
// low --> Starting index,
// high --> Ending index
static void quickSort(int[] arr, int low, int high)
{
    if (low < high) {

        // pi is the partitioning index; arr[pi]
        // is now at its right place
        int pi = partition(arr, low, high);

        // Separately sort elements before
        // the partition and after the partition
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}
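
The O(N²) worst case above comes from consistently bad pivots, for example an already sorted array with the last element always chosen as the pivot. One common mitigation, sketched here as an assumption rather than as part of the original code, is to swap a randomly chosen element into the pivot slot before partitioning; it reuses the partition and swap methods above.

// Sketch: randomized pivot selection makes the O(N^2) worst case very unlikely
static final java.util.Random RANDOM = new java.util.Random();

static int randomizedPartition(int[] arr, int low, int high)
{
    // Pick a random index in [low, high] and move that element into the pivot position
    int randomIndex = low + RANDOM.nextInt(high - low + 1);
    swap(arr, randomIndex, high);
    return partition(arr, low, high);
}

// quickSort would then call randomizedPartition(arr, low, high)
// in place of partition(arr, low, high).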

Thanks for Reading. Claps are welcome.

Did you know you could clap 👏 as much as 50 times 😊?

Until Next Time
