Recursion, A Prelude to Merge Sort

Scott Young
4 min read · Jun 22, 2017


Last week we implemented and discussed the insertion sort in Ruby. We saw that insertion sort is fairly intuitive from a human being's point of view, but, as far as sorting algorithms go, it is not very efficient. Rather than dive right into the merge sort this week, we are going to talk about recursion, a pattern that can be found all around us and is critical to understand in order to implement the merge sort. Don't let the name, or what you might have heard about recursion in the past, scare you; it's really not that difficult. Once we have recursion under our belts, we will implement the merge sort in Ruby next week.

A recursive algorithm is simply an algorithm that calls itself to solve a problem in progressively simpler steps. Let's take a look at a simple recursive algorithm that prints 'Hello World' n times.
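Here is a minimal Ruby sketch of one way to write such a method, laid out so that the base-case check falls on line 2, the print on line 4, and the recursive call on line 6, to match the walkthrough below:

    def HelloWorldRecursive(n)
      return if n < 1             # base case: stop once n drops below one

      puts 'Hello World'          # the work done by this call

      HelloWorldRecursive(n - 1)  # recurse with n reduced by one
    end

    HelloWorldRecursive(3)
    # Hello World
    # Hello World
    # Hello World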

Let's step through this algorithm line by line, assuming we call the method like so: HelloWorldRecursive(3). You will see on line 2 above that we first check whether n is less than one. This critically important step in all recursive algorithms is called the base case: the case (or sometimes cases) that triggers a recursive algorithm to exit. Without this check we would run the algorithm endlessly, eventually running out of memory. All recursive algorithms must have a base case, or in some situations more than one.

Next, starting on line 4, assuming the base case was not triggered, we begin doing some recursive work. First, we simply print Hello World to the console; simple enough. Then comes the recursion: notice on line 6 that we call the method we are already in, passing it n decremented by one. This is also critically important. If we didn't decrease n, we would never reach our base case and would be stuck in an infinite loop.

Ultimately, this is a very contrived example that could have been solved much more easily with a simple loop. However, it illustrates the two critical pieces involved when designing a recursive algorithm.

  • A recursive algorithm must have a base case that allows the algorithm to end.
  • A recursive algorithm must manipulate the data in question in some way so as to eventually trigger the base case.

Without both of the above requirements, an infinite loop is inevitable, which is certainly not the outcome we want.

Hopefully that silly, simplified example explained the basics of recursion in a way that clears up any confusion you might have had. Let's solidify that knowledge by trying to solve a problem that actually lends itself a little better to recursion: factorials.

What is a factorial, you might be asking (or you might not, if you already know). For those of you who don't, let's have a quick look. Very basically, the factorial of a number n is n multiplied by every positive integer less than n, down to 1. More formally, n! = n * (n-1) * (n-2) * … * 2 * 1. For example, 4! = 4 * 3 * 2 * 1 = 24.

Maybe you are already seeing the recursive pattern. We can solve 3! by finding out what 3 * 2! is. We can then solve 2! by finding out what 2 * 1! is. But wait, 1! is just 1. Ah, we have found our base case. So let's take a crack at coding this.
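One minimal way to write this in Ruby might look like the following sketch:

    def factorial(n)
      return 1 if n == 1      # base case: 1! is 1

      n * factorial(n - 1)    # recursive case: n! = n * (n - 1)!
    end

    factorial(4) # => 24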

This pattern should look very familiar after our Hello World recursive algorithm above. We start by checking our base case: if n == 1, we simply return 1, because we know that 1! is 1. Otherwise we call our factorial method recursively and ask it what n * factorial(n - 1) is. This fulfills our second requirement for a recursive algorithm stated above: by subtracting 1 from n on each recursive call of factorial(), we know that eventually n will equal 1 and our recursive calls will begin to return their results. The image below might help clear up how these return statements work.

[Image: factorial call/return trace]
http://ksuweb.kennesaw.edu/~bsetzer/1302sp16/nanoc/output/notes/images/figure-18-02-factorial-call-return-trace.png

Note that the above image illustrates an algorithm that checks for a base case of n == 0 rather than n == 1; the result is the same, though, since the extra step just computes 1 * 1 = 1.
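Written out in text form, the same call-and-return trace for factorial(3), using the n == 1 base case from the sketch above, looks roughly like this:

    factorial(3)
    # => 3 * factorial(2)
    # => 3 * (2 * factorial(1))
    # => 3 * (2 * 1)
    # => 6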

Hopefully, recursion is now a little less enigmatic. This was a very simple study in recursion, meant mostly for web developers and other less math-intensive groups of programmers. Recursion is a simple idea that only seems complex because it is not very intuitive to us as human beings; recursive patterns often don't jump out at us the way some other patterns do. That being said, recursion is everywhere in computer science, from merge sort to binary trees and beyond. Thanks to our newfound knowledge of recursion, we can tackle the merge sort next week. See you then.
