Complexity analysis of recursive functions

John Anto
4 min read · Mar 18, 2024


Recursion and JavaScript

Introduction

Recursion in programming is a technique where a function calls itself to solve a problem by breaking it down into smaller instances of the same problem. This approach allows for elegant and concise solutions to problems with a self-similar structure. Recursion requires careful consideration of base cases to prevent infinite loops. Understanding recursion is fundamental for mastering many programming concepts and algorithms.

Recursive Function

A recursive function has two essential parts: a base case, which ends the recursion, and a recursive case, which continues it. The function calls itself until the base case condition is met.

In JavaScript, there are three ways for a function to refer to itself:

  1. Using the function’s name.
  2. arguments.callee (deprecated; accessing it throws a TypeError in strict mode).
  3. An in-scope variable that refers to the function.

For example:

const func = function someFunc() {
  // statements go here
};

Within the function body, the following are all equivalent:

  1. someFunc()
  2. arguments.callee()
  3. func()
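
For instance, a function can recurse through its own name even when it is assigned to a variable with a different name. A minimal sketch (the factorial function here is a hypothetical example, not from the original):

const factorial = function fac(n) {
  // base case: 0! and 1! are both 1
  if (n <= 1) {
    return 1;
  }
  // recursive case: refer to the function via its own name, "fac"
  return n * fac(n - 1);
};

console.log(factorial(5)); // 120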

In some ways, recursion is analogous to a loop. Both execute the same code multiple times, and both require a terminating condition to avoid infinite execution.

For example, the while loop below,

const doStuff = (n) => { console.log(n); };

// prints numbers from 0 to 9
let x = 0;
// "x < 10" is the loop condition
while (x < 10) {
  // do stuff
  doStuff(x);
  x++;
}

can be converted into a recursive function as:

function doStuff(x) {
  // "x >= 10" is the exit condition (equivalent to "!(x < 10)")
  if (x >= 10) {
    return;
  }

  console.log(x);

  doStuff(x + 1); // the recursive call
}

// do stuff
doStuff(0);

However, some algorithms cannot be expressed as simple iterative loops. For example, visiting all the nodes of a tree structure is far easier via recursion. It is possible to convert any recursive algorithm into a non-recursive one, but the logic is often much more complex, and doing so requires an explicit stack. Recursion, in turn, is limited by the size of the call stack.
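
To make this concrete, here is a minimal sketch of both approaches; the { value, children } node shape is a hypothetical one chosen for this example:

// Recursive: collect every node's value in the tree
const collectNodes = (node, result = []) => {
  result.push(node.value);
  for (const child of node.children) {
    collectNodes(child, result); // recurse into each subtree
  }
  return result;
};

// Iterative: the same traversal needs an explicit stack
const collectNodesIterative = (root) => {
  const result = [];
  const stack = [root];
  while (stack.length > 0) {
    const node = stack.pop();
    result.push(node.value);
    stack.push(...node.children); // note: siblings come off the stack in reverse order
  }
  return result;
};

const tree = {
  value: 1,
  children: [
    { value: 2, children: [] },
    { value: 3, children: [{ value: 4, children: [] }] },
  ],
};

console.log(collectNodes(tree)); // [1, 2, 3, 4]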

A real-world example of recursion is computing the Fibonacci series number at position n:

const fibonacci = (n) => (n <= 2 ? 1 : fibonacci(n - 1) + fibonacci(n - 2));
console.log(fibonacci(10)); // 55

Recurrence relation

A recurrence relation is a mathematical equation in which any term is defined using its previous terms. We can use recurrence relations to analyze the time complexity of recursive algorithms. Mathematically, if the time complexity for input size n is T(n), then the time complexity of the smaller subproblems is defined by the same function applied to the subproblems’ input sizes, i.e.,

T(n) = T(input size of 1st subproblem) + T(input size of 2nd subproblem) + … + T(input size of kth subproblem) + cost of the additional operations other than the recursive calls.
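
Applying this to the Fibonacci function above: a call with input n makes two recursive calls with inputs n - 1 and n - 2, plus a constant amount c of additional work, so its recurrence is:

T(n) = T(n - 1) + T(n - 2) + c

Solving this recurrence shows that the naive implementation grows exponentially in n (roughly O(2^n)), because the same subproblems are recomputed over and over.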

Steps to analyze the time complexity of recursive algorithms:

  1. Identify the input sizes of the larger problem and the smaller subproblems.
  2. Write the recurrence relation for the time complexity.
  3. Solve the recurrence relation using the recursion tree or master theorem approach to get the time complexity.

Analysis of merge sort

Merge sort is a recursive algorithm that repeatedly splits the array in half until it cannot be divided further, i.e., each subarray has only one element left (an array with one element is always sorted). The sorted subarrays are then merged back into one sorted array.
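
A minimal sketch of the algorithm, written here for reference (a standard top-down merge sort, not code from the original article):

const mergeSort = (arr) => {
  // base case: an array with 0 or 1 elements is already sorted
  if (arr.length <= 1) {
    return arr;
  }

  // recursive case: split the array in half and sort each half
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));

  // merge the two sorted halves; this is the c * n work at each level
  const merged = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
};

console.log(mergeSort([5, 2, 9, 1, 5, 6])); // [1, 2, 5, 5, 6, 9]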

Let’s say we have to sort an array of size n using merge sort, and let the time complexity be T(n). Each call splits the array in half and then merges the two sorted halves, so the recurrence is T(n) = 2 * T(n/2) + c * n, where c * n is the cost of the additional (merging) operations. In the recursion tree, we start with c * n at the root level, the cost of the additional operations on the full input. The left and right children of the root node each have time complexity T(n/2), because n is divided by 2 at each level of the tree. So at the second level there are two nodes of size n/2, the cost of the additional operations at each node is c * (n/2), and the combined cost of the level is c * (n/2) + c * (n/2) = c * n.

Similarly, as we move to the lower levels, the total cost per level remains c * n. At the third level there are 4 nodes, each with time complexity T(n/4) (half the input size of the second level), so the combined cost of the level is c * (n/4) + c * (n/4) + c * (n/4) + c * (n/4) = c * n.

In general, at any level i there are 2^i nodes, each with input size n / 2^i, so the total cost of the level is 2^i * c * (n / 2^i), which is always c * n.

The tree will have log n levels (log base 2 of n), because the input size is divided by 2 at each level until the last level has input size 1.

So the total time complexity is the sum of the costs of all the levels: T(n) = c * n * log n.

After ignoring the constant c, T(n) = n * log n, so the Big-O complexity of merge sort is O(n log n).
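
As a cross-check (this is an addition to the original analysis), the same result follows from the master theorem. For a recurrence of the form T(n) = a * T(n/b) + f(n), merge sort has a = 2, b = 2, and f(n) = c * n. Since f(n) grows at the same rate as n^(log_b a) = n^(log_2 2) = n^1, this is the "equal cost at every level" case of the theorem, and the solution is T(n) = Θ(n * log n), matching the recursion tree result.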

Conclusion

While recursive algorithms often provide elegant and intuitive solutions, they may incur higher time and space costs than iterative approaches due to repeated function calls and call-stack overhead. Therefore, it’s essential to carefully analyze the recursion tree and identify its patterns to derive the time complexity equation.

Watch the recursion video on YouTube: https://www.youtube.com/watch?v=UvU6T-mOyao

Follow me on YouTube for more insightful videos: https://youtube.com/@coderoly

Follow me on medium: https://medium.com/@developer.johnanto

