Heap Sort — Concepts and Visualization

Nash
Oct 2, 2022

The Heap Sort algorithm works by splitting the array into a sorted part and an unsorted part, then repeatedly extracting the minimum element from the unsorted part and moving it to the sorted part. The concept is similar to selection sort; however, heap sort keeps the unsorted part in a heap data structure, which makes finding the smallest element in each iteration much cheaper.
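For comparison, here is a minimal selection sort sketch in TypeScript (my own illustration, not code from this article): it finds each minimum with a linear scan of the unsorted part, which is exactly the step that heap sort speeds up by keeping the unsorted part in a heap.

// Minimal selection sort sketch: each pass scans the unsorted part arr[i..]
// for its minimum and moves it to the boundary of the sorted part.
// Heap sort replaces this O(n) scan with an O(log n) heap extraction.
function selectionSort(arr: number[]): number[] {
  for (let i = 0; i < arr.length; i++) {
    let minIndex = i;
    for (let j = i + 1; j < arr.length; j++) {
      if (arr[j] < arr[minIndex]) minIndex = j;
    }
    [arr[i], arr[minIndex]] = [arr[minIndex], arr[i]];
  }
  return arr;
}

console.log(selectionSort([5, 3, 8, 1])); // [1, 3, 5, 8]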

Detailed Explanation

First, we build a heap data structure from the array (a heap is a kind of binary tree). The key property of a heap is that its root node holds the smallest element (in the case of a min heap).
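In practice a heap is usually stored as a flat array rather than as linked nodes; the tree structure is implicit in the indices. Here is a small sketch of the standard index arithmetic (my own illustration, the article's pseudocode hides this behind helpers like getParentNode):

// Standard array layout of a binary heap: the node at index i has its
// parent at floor((i - 1) / 2) and its children at 2i + 1 and 2i + 2.
const parentIndex = (i: number): number => Math.floor((i - 1) / 2);
const leftChildIndex = (i: number): number => 2 * i + 1;
const rightChildIndex = (i: number): number => 2 * i + 2;

// Example: the min heap [1, 3, 2, 5, 4] describes this tree:
//        1
//      /   \
//     3     2
//    / \
//   5   4
console.log(parentIndex(4));    // 1  (heap[4] = 4 has parent heap[1] = 3)
console.log(leftChildIndex(0)); // 1  (the root heap[0] = 1 has left child heap[1] = 3)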

Source: GeeksForGeeks

The sorting happens as we take the root node out of the heap and append it to the sorted array. The problem is that the heap is now missing its root and loses the heap property.

To fix that, we pick a leaf node and move it into the root position. Since the heap must preserve the property that the minimum element sits at the root, we then adjust the heap to restore that property (this process is called "heapify").

The black elements are the heap (the root of the heap is the darkest)

After the heapify process finishes, we remove the new root node and add it to the sorted array again, and we repeat this until the heap becomes empty (there is no root left to remove). At that point, all of the items in the array are sorted. For example, starting from the min heap [1, 3, 2, 5, 4]: remove 1 (sorted: [1]) and heapify, remove 2 (sorted: [1, 2]) and heapify, and so on until the sorted array is [1, 2, 3, 4, 5].

Pseudocode

1. Main Function

minHeap = new MinHeap()

// Build a heap that contains every element of the array.
For each element in arr
    minHeap.add(element)

// The root is always the min element in the heap,
// so removing it N times yields the elements in sorted order.
For i = 0 to N - 1
    arr[i] = minHeap.removeRoot()

2. MinHeap Class

class MinHeap {
    heap = []

    add(element) {
        // Append the new element at the end of the heap (rightmost leaf),
        // then move it up until the heap property holds again.
        heap.push(element)
        heapifyUp()
    }

    removeRoot() {
        if (heap.length == 0) return
        if (heap.length == 1) return heap.pop()
        rootNode = heap.shift()   // take out the min element.
        leafNode = heap.pop()     // take the last leaf...
        heap.unshift(leafNode)    // ...and use it as the new root.
        heapifyDown()
        return rootNode
    }

    heapifyUp() {
        // When we add an element at the end of the heap, we might have to move it up.
        leafNode = heap[heap.length - 1]   // rightmost leaf node.
        parentNode = getParentNode(leafNode)

        While (parentNode exists and leafNode < parentNode) {
            // leafNode is smaller, so it should move up.
            swap(leafNode, parentNode)
            leafNode = parentNode
            parentNode = getParentNode(leafNode)
        }
    }

    heapifyDown() {
        // When we move a leaf node to the root of the heap, we might have to move it down.
        rootNode = heap[0]
        leftChildNode = getLeftChildNode(rootNode)
        rightChildNode = getRightChildNode(rootNode)

        While (rootNode > leftChildNode or rootNode > rightChildNode) {
            // rootNode is bigger than a child, so it should move down.
            targetChildNode // the child to be swapped with rootNode.

            // Case 1: rootNode has only one child node.
            if (hasOneChild(rootNode))
                targetChildNode = isLeftChildNodeEmpty(rootNode) ? rightChildNode : leftChildNode
                // pick whichever child exists, left or right.

            // Case 2: rootNode has two child nodes.
            else
                targetChildNode = rightChildNode < leftChildNode ? rightChildNode : leftChildNode
                // pick the smaller of the two children.

            swap(rootNode, targetChildNode)
            rootNode = targetChildNode
            leftChildNode = getLeftChildNode(rootNode)
            rightChildNode = getRightChildNode(rootNode)
        }
    }
}
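For readers who want something they can actually run, here is a compact TypeScript sketch of the same idea. It is my own adaptation, not the exact code behind this article or the visualization: it stores the heap in a flat array, uses the index arithmetic shown earlier, and replaces the root with the last leaf via pop instead of shift/unshift, which avoids re-indexing the whole array.

class MinHeap {
  private heap: number[] = [];

  add(element: number): void {
    this.heap.push(element);              // insert at the rightmost leaf...
    this.heapifyUp(this.heap.length - 1); // ...then move it up if needed.
  }

  removeRoot(): number | undefined {
    if (this.heap.length === 0) return undefined;
    if (this.heap.length === 1) return this.heap.pop();
    const root = this.heap[0];
    this.heap[0] = this.heap.pop()!; // move the last leaf to the root...
    this.heapifyDown(0);             // ...then move it down if needed.
    return root;
  }

  private heapifyUp(i: number): void {
    while (i > 0) {
      const parent = Math.floor((i - 1) / 2);
      if (this.heap[i] >= this.heap[parent]) break; // heap property holds.
      [this.heap[i], this.heap[parent]] = [this.heap[parent], this.heap[i]];
      i = parent;
    }
  }

  private heapifyDown(i: number): void {
    const n = this.heap.length;
    while (true) {
      const left = 2 * i + 1;
      const right = 2 * i + 2;
      let smallest = i;
      if (left < n && this.heap[left] < this.heap[smallest]) smallest = left;
      if (right < n && this.heap[right] < this.heap[smallest]) smallest = right;
      if (smallest === i) break; // heap property holds.
      [this.heap[i], this.heap[smallest]] = [this.heap[smallest], this.heap[i]];
      i = smallest;
    }
  }
}

function heapSort(arr: number[]): number[] {
  const minHeap = new MinHeap();
  for (const element of arr) minHeap.add(element); // build the heap.
  for (let i = 0; i < arr.length; i++) {
    arr[i] = minHeap.removeRoot()!;                // the root is the current min.
  }
  return arr;
}

console.log(heapSort([5, 3, 8, 1, 9, 2])); // [1, 2, 3, 5, 8, 9]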

Time Complexity

In heap sort, most of the time is spent preserving the heap property, so that the root of the heap always holds the proper element, whether that is the min or the max. In other words, we spend most of the time in the heapify functions. Since the heap is a binary tree, each heapify walks along at most one root-to-leaf path, which takes log(n) operations. We heapify n times, once after every root removal (and building the heap by repeated insertion also costs one heapify per element), so the overall complexity is O(n log(n)).
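If you want a rough empirical feel for that bound, the snippet below (again my own sketch, relying on the heapSort function from the previous section) times a few input sizes. The absolute numbers depend on your machine, but each doubling of n should take a bit more than twice as long.

// Rough timing check of the O(n log(n)) behavior, using the heapSort sketch above.
for (const n of [1 << 15, 1 << 16, 1 << 17]) {
  const data = Array.from({ length: n }, () => Math.random());
  console.time(`heapSort, n = ${n}`);
  heapSort(data);
  console.timeEnd(`heapSort, n = ${n}`);
}
// n * log2(n) grows from about 0.5 million to about 2.2 million steps across these sizes.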

That’s it!

Thanks for reading, and I hope you enjoyed the article. Any feedback or corrections will be appreciated.

Wanna play around?

Want to slow the animation down? Or maybe increase the array size up to 1500 items? Feel free to go ahead and try it on my website at sortlab.click

