Solving recurrences using the Master method

Borzoo Esmailloo
3 min read · Jul 27, 2019


The Master method is a surprisingly straightforward way to determine the asymptotic complexity of an algorithm. All you need to do is remember 3 simple rules and you’re all set.

You must be thinking this is too good to be true, there must be a catch! Well, you’re right. The Master method can only be used if the recurrence has specific properties. But don’t worry, these properties are actually very common for divide and conquer algorithms.

When can the Master method be used

The Master method can be used for recurrences of the form:

T(n) = aT(n/b) + f(n)

where a ≥ 1, b > 1, and f(n) is an asymptotically positive function.

Here, we’re dividing a problem of size n into a sub-problems of size n/b. We solve the sub-problems recursively and, if needed, combine the results. The cost of dividing a problem of size n into sub-problems and of combining their results is f(n).

Master method rules

Now that we have confirmed that the recurrence has the appropriate form, we can use these 3 simple rules to determine its asymptotic complexity:

Master theorem: compare f(n) against n^(log_b a).

1. If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = 𝚹(n^(log_b a)).
2. If f(n) = 𝚹(n^(log_b a)), then T(n) = 𝚹(n^(log_b a) lg n).
3. If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = 𝚹(f(n)).
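As a quick sanity check, the comparison at the heart of the theorem can be sketched in a few lines of Python. This helper (the name master_case is ours, not part of any library) assumes the simple case where f(n) = 𝚹(n^d) for some d ≥ 0, and reports which rule applies by comparing d against the critical exponent log_b(a):

```python
import math

def master_case(a, b, d):
    """For T(n) = a*T(n/b) + Theta(n^d), return which Master-theorem case applies.

    Assumes f(n) is a plain polynomial Theta(n^d), so the regularity
    condition of case 3 holds automatically.
    """
    crit = math.log(a) / math.log(b)  # the critical exponent log_b(a)
    if math.isclose(d, crit):
        return 2  # f(n) matches n^(log_b a): T(n) = Theta(n^crit * lg n)
    if d < crit:
        return 1  # f(n) is polynomially smaller: T(n) = Theta(n^crit)
    return 3      # f(n) is polynomially larger: T(n) = Theta(f(n))

print(master_case(2, 2, 0))  # binary tree traversal, T(n) = 2T(n/2) + Theta(1) -> 1
print(master_case(2, 2, 1))  # merge sort, T(n) = 2T(n/2) + Theta(n) -> 2
```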

We’re going to discover why these rules make sense, but first, let’s try it out with some samples.

Let’s start with a familiar algorithm, depth first traversal of binary trees. The algorithm traverses the tree and does some operation on its nodes in a specific order. We’ll use the in-order tree traversal algorithm here.

inorder(tree)
    if tree is empty, return
    inorder(tree->left)
    visit the root of the tree
    inorder(tree->right)
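The pseudocode above translates directly into runnable Python. The Node class and the visit callback below are our own minimal scaffolding, not part of the original pseudocode:

```python
class Node:
    """A minimal binary tree node."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def inorder(tree, visit):
    if tree is None:            # "if tree is empty, return"
        return
    inorder(tree.left, visit)   # recurse into the left subtree
    visit(tree.value)           # "visit the root of the tree"
    inorder(tree.right, visit)  # recurse into the right subtree

# Usage: traverse a small tree and collect the values in sorted order.
root = Node(2, Node(1), Node(3))
out = []
inorder(root, out.append)
print(out)  # [1, 2, 3]
```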

If you remember, the recurrence for an algorithm consists of the cost of dividing and combining the results and the cost of recursively solving the sub-problems.

Here, the cost of dividing the problem into sub-problems is trivial: accessing tree->left and tree->right takes constant time. The cost of combining (even though we’re not really combining anything here, let’s keep calling it that for the sake of consistency) is visiting the root node, which also takes constant time since it doesn’t depend on the problem size n. Thus f(n) = 𝚹(1).

Since this is a binary tree (let’s also assume it’s a full binary tree), we’re dividing the problem into two sub-problems of size n/2. Thus we end up with the recurrence:

T(n) = 2T(n/2) + 𝚹(1)

Case 1 of the Master method applies here for any 0 < ε < 1 (since log₂ 2 = 1 and 𝚹(1) = O(n^(1−ε))), so the solution to the recurrence is 𝚹(n). This makes sense if we observe that traversing a tree of size n visits each node once, for a total of 𝚹(n).

As another example, let’s look at the Merge Sort algorithm. Merge Sort solves the problem of ordering a collection by splitting the collection into two (almost) equal parts, sorting those parts recursively and combining the two sorted parts. Pseudocode of merge sort looks something like this:

merge_sort(collection)
    if collection has 1 element, return collection
    else:
        middle_index = collection.length / 2
        left_col = collection[0..middle_index-1]
        right_col = collection[middle_index..collection.length-1]
        sorted_left_col = merge_sort(left_col)
        sorted_right_col = merge_sort(right_col)
        return merge of sorted_left_col and sorted_right_col, done in 𝚹(n) time
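Here is a runnable Python version of that pseudocode. The merge helper is our own name for the 𝚹(n) combining step:

```python
def merge(left, right):
    """Combine two sorted lists into one sorted list in Theta(n) time."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # one of these two is empty at this point
    result.extend(right[j:])
    return result

def merge_sort(collection):
    if len(collection) <= 1:
        return collection
    middle_index = len(collection) // 2
    left_col = collection[:middle_index]
    right_col = collection[middle_index:]
    return merge(merge_sort(left_col), merge_sort(right_col))

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```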

Here, we’re dividing the problem into 2 sub-problems of size n/2. Let’s assume that we can divide the collection in constant time. The two sub-problems are sorted recursively and the results are combined in 𝚹(n) time. Thus the total time needed for dividing and combining the sub-problems is f(n) = 𝚹(n). We end up with the recurrence:

T(n) = 2T(n/2) + 𝚹(n)

The second case of the Master method applies here (since log₂ 2 = 1 and f(n) = 𝚹(n¹)), so the solution to the recurrence is 𝚹(n lg n).
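As before, we can check this numerically by evaluating the recurrence, taking the 𝚹(n) merge cost to be exactly n (a sketch for n a power of two):

```python
import math

def merge_work(n):
    """Evaluate T(n) = 2T(n/2) + n with T(1) = 0, for n a power of two."""
    if n <= 1:
        return 0
    return 2 * merge_work(n // 2) + n

for n in [2, 4, 8, 16]:
    print(n, merge_work(n), n * math.log2(n))  # the last two columns agree: n lg n
```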

In the next part, we will go over the proof of Master theorem and see why it works.
