AVL Trees: A Complete Guide

Nathan Pillai
6 min read · Dec 16, 2019

--

Conqueror of the Marvel Universe, Thanos, speaking on the efficiency of AVL Trees

An AVL tree is what is known as a self-balancing binary search tree, created by Georgy Adelson-Velsky and Evgenii Landis (hence the name: AVL tree). Before we dig into what an AVL tree is and what its benefits are, let us define some key terminology.

Tree height and depth:

Image from growingwiththeweb.com

Height: The height of a node is the length of the longest path from that node down to a leaf. The height of the whole tree is simply the height of its root.

Depth: The depth of a node is the length of the path from that node up to the root (not essential to understanding AVL trees, but it’s good to know)

An easy way to remember the difference is to count the number of edges relative to the node you’re looking at (see the short sketch after this list):

  • Depth: Count the # of edges from the desired node up to the root
  • Height: Count the # of edges along the longest path from the desired node down to a leaf
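
To make the counting concrete, here is a minimal Python sketch of both measurements (the `Node`, `height`, and `depth` names are illustrative, not from any particular library; an empty subtree is counted as height -1, a convention we will reuse later):

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def height(node):
    """Edges on the longest downward path from `node` to a leaf."""
    if node is None:
        return -1          # empty subtree, by convention
    return 1 + max(height(node.left), height(node.right))

def depth(root, target):
    """Edges on the path from the root down to `target` (None if absent)."""
    if root is None:
        return None
    if root is target:
        return 0
    for subtree in (root.left, root.right):
        d = depth(subtree, target)
        if d is not None:
            return d + 1
    return None
```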

What does a “balanced tree” mean?

Given a binary search tree, the rule for its construction is that each inserted node’s value is compared against its parent: a value less than the parent belongs in the left branch, and a value greater than (or equal to) the parent belongs in the right branch. Since a binary tree has at most two branches, this comparison determines which branch (left or right) a node should follow in order to be inserted. This process results in us having a left and a right subtree hanging off the root node of our tree.

This brings us to the definition of a balanced tree: a tree is considered balanced when the difference between the heights of the left subtree and the right subtree is no more than 1. Let’s take a look at a few examples…

As we can see in the first image of the balanced tree, the height of the right subtree is 0 and the height of the left subtree is 1. The height difference between these subtrees is (1 − 0) = 1, which meets our definition of balanced. In contrast, if we take a look at the first unbalanced tree, the left subtree has a height of 0 and the right subtree has a height of 2. The height difference between these subtrees is (2 − 0) = 2, which makes the tree unbalanced.

The second unbalanced tree is more interesting to look at because it doesn’t even look like a tree! It looks more like a linked list, doesn’t it? I drew a hypothetical left subtree branch node in order to visualize how we calculate the height difference between the left and right subtrees. Since the left subtree didn’t exist, I gave it a hypothetical height of -1. So (−1 − 1) = −2, which makes that tree unbalanced. We obtain this kind of binary tree when we insert data in sorted or reverse sorted order. In sorted order, our data would look like a linked list branching recursively to the right. In reverse sorted order, our data would look like a linked list branching recursively to the left. These “linked list trees” are quite literally the worst possible trees to obtain, as we will see.
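
As a quick illustration, here is a rough sketch of a plain, non-balancing BST insert (reusing the hypothetical `Node` and `height` helpers from the earlier sketch), showing how sorted input produces exactly this right-leaning chain:

```python
def bst_insert(root, value):
    """Plain BST insert with no rebalancing."""
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = bst_insert(root.left, value)
    else:
        root.right = bst_insert(root.right, value)
    return root

root = None
for v in [1, 2, 3, 4, 5]:   # sorted input
    root = bst_insert(root, v)

# Every node chains to the right: the tree's height is n - 1, the worst case.
print(height(root))                              # 4
# At the root: the left subtree is empty (-1), the right subtree has height 3.
print(height(root.left) - height(root.right))    # -4
```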

Comparing Time and Space Complexity of Binary Search Trees (BST) and AVL Trees:

As you might assume from the description of an AVL tree as “self-balancing”, an AVL tree balances its subtrees as it is being built. We will take a look at how this works later, but for now, just understand that an AVL tree is ALWAYS balanced.

Now that we know what a balanced and unbalanced tree looks like, let’s explore the benefits of operating on an AVL self-balancing tree in contrast to a traditional Binary Search Tree.

Both of the above images are courtesy of packtpub.com

When it comes to space complexity, an AVL tree doesn’t offer any benefit, which is to be expected since we are not changing how much data is stored, only how that data is structured. However, when it comes to the time complexity of the tree operations, that’s where the AVL tree vastly outshines the traditional binary search tree. As you can see, the only difference between the two charts above is that the worst-case scenarios for the AVL tree operations are O(log(n)), in contrast to the O(n) worst case for the exact same operations in a traditional binary search tree.

Why are the worst and average cases of the AVL tree operations exactly the same? This is due to the “self-balancing” aspect of the AVL tree, which guarantees us a balanced tree at all times. In a balanced binary tree, searching, inserting, and deleting all take logarithmic time because the number of nodes left to consider is cut in half at each level as we traverse down the tree. The whole benefit of storing our data inside a binary tree is to unlock these great runtime complexities, which are only possible in a balanced binary tree, and the AVL tree guarantees balance at all times. As we saw in the drawn examples, without balancing, a tree could become as bad as a linked list, which would mean we would need to traverse through all possible values (O(n) time complexity) in order to do any operation in the worst case!
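
To put rough numbers on that, assume (purely for illustration) a tree holding one million values:

```python
import math

n = 1_000_000
# A balanced tree halves the candidates at every level, so a search
# touches at most about log2(n) + 1 nodes.
balanced_steps = math.floor(math.log2(n)) + 1    # 20
# A degenerate "linked list" tree may force us to visit every node.
degenerate_steps = n                             # 1,000,000
print(balanced_steps, degenerate_steps)
```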

The One Rule For Building an AVL Tree:

The difference between the heights of the two subtrees at any node in an AVL tree should never exceed 1 level.

This rule is represented by the formula |h1 − h2| <= 1, where h1 and h2 are the recursive heights of the left and right subtrees respectively. The difference h1 − h2 is known as the balance factor. When a subtree violates this rule, that indicates that we need to rebalance that portion of our tree through something known as rotations.
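
In code, the rule is a one-line check on top of the `height` helper sketched earlier (again an illustrative sketch, with an empty subtree counted as height -1):

```python
def balance_factor(node):
    """h1 - h2: the height of the left subtree minus the height of the right."""
    return height(node.left) - height(node.right)

def violates_avl_rule(node):
    """True when |h1 - h2| > 1, i.e. this subtree needs a rotation."""
    return abs(balance_factor(node)) > 1
```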

Self Balancing By Rotations: How do we keep our tree balanced?

Single Rotation Image by TutorialsPoint

Now we’re coming to the part where we learn what it means for our AVL tree to be “self-balancing”. There are two categories of rebalancing rotations: left rotations and right rotations. In general, a rotation simply swaps an unbalanced node with one of its children and rearranges the pointers as needed. We distinguish rotations as left and right by the direction in which the child moves up to take its parent’s place (see the left rotation above).
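
Here is a simplified sketch of the two single rotations (illustrative code; parent pointers and stored-height bookkeeping are omitted for brevity). Each function returns the new root of the rotated subtree, which the caller hooks back into the parent:

```python
def rotate_left(node):
    """Promote `node`'s right child; used when the right side is too tall."""
    pivot = node.right
    node.right = pivot.left   # pivot's left subtree becomes node's right subtree
    pivot.left = node         # node drops down to become pivot's left child
    return pivot              # pivot is the new root of this subtree

def rotate_right(node):
    """Mirror image: promote `node`'s left child when the left side is too tall."""
    pivot = node.left
    node.left = pivot.right
    pivot.right = node
    return pivot
```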

Double Rotation Image from cs.stackexchange.com

When we insert a new node into our AVL tree that causes an imbalance in a subtree, our tree will do a left or right rotation at the imbalanced node as needed to balance the tree. However, the resulting tree may still be imbalanced as a result of the rotation. No worries! When we make the rotation, we can check if that operation made our tree imbalanced and if so, we can do another left or right rotation as needed. This type of situation is known as a double rotation.
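
In most implementations, the double-rotation cases are detected up front by inspecting the child’s balance factor rather than by rotating and then re-checking. Here is a sketch built on the helpers above (`rebalance` is an illustrative name, not from the article):

```python
def rebalance(node):
    """Fix an imbalance at `node` with a single or double rotation."""
    bf = balance_factor(node)
    if bf > 1:                                   # left side too tall
        if balance_factor(node.left) < 0:        # left-right case:
            node.left = rotate_left(node.left)   # first rotate the left child left...
        return rotate_right(node)                # ...then rotate `node` right
    if bf < -1:                                  # right side too tall
        if balance_factor(node.right) > 0:       # right-left case:
            node.right = rotate_right(node.right)
        return rotate_left(node)
    return node                                  # already within the AVL rule
```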
