Time Complexity

Rafael
4 min read · Jul 14, 2024


Photo by Miguel Á. Padriñán from Pexels: https://www.pexels.com/photo/analog-clock-sketch-in-black-surface-745365/

What is Time Complexity?

Time complexity is a measure of how the running time of an algorithm grows in relation to the size of its input. It is one of the most important factors to consider when analyzing and comparing algorithms.

How do We Calculate Time Complexity?

We don’t calculate time complexity based on the actual running time of an algorithm, as that is dependent on the machine, programming language, compiler, and so on. Instead, we calculate the time complexity based on how many instructions need to be executed for the algorithm to complete its job for an input of a certain size, which we call n.

Basic example

Now, suppose we want to sum up all the numbers between 1 and 1000 (inclusive), and so we set up the following algorithm.

Simple algorithm to sum up all the numbers from 1 to 1000.
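The pictured code isn't reproduced here, but a minimal sketch of such an algorithm in Python might look like this (names are assumptions of mine, not from the original):

```python
def sum_up_to(n):
    # Add every number from 1 to n (inclusive), one step per number.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

print(sum_up_to(1000))  # 500500
```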

As you can see, the number of steps executed is dependent on the value of n. If n is 5, it takes 5 steps. If n is 500, it takes 500 steps. So the algorithm takes n steps/instructions.

Frequency count method

This method is utilized to calculate the time complexity of an algorithm and is based on the number of steps the algorithm takes to fulfill its task.

For instance, if we put the previous algorithm into a function and calculated its time complexity using the frequency count method, we’d assign 1 unit of time to each statement and sum the number of times each statement would be executed:

Time complexity calculation

The number of instructions to be executed as a polynomial function is f(n) = 3n + 3, which means that the above program takes 3n + 3 instructions/steps to finish the work, given an input of size n.
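The tally from the image isn't reproduced here, but here is one plausible statement-by-statement count that arrives at f(n) = 3n + 3 (the exact breakdown depends on how the loop is written; conventions vary):

```python
def sum_up_to(n):
    total = 0       # runs 1 time
    i = 1           # runs 1 time
    while i <= n:   # checked n + 1 times (the final check fails)
        total += i  # runs n times
        i += 1      # runs n times
    return total
    # Total units: 1 + 1 + (n + 1) + n + n = 3n + 3
```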

Expressing in Big O

Now, in order to express the time complexity of this algorithm in Big O, we need to determine which term of the polynomial has the highest power of n, aka the degree of the polynomial. Once we know that, we take n raised to that power, dropping the coefficient (the constant number multiplying n), and the result (n^x) is used as the function inside Big O.

In the previous example, f(n) = 3n + 3 determines the number of steps required for the sum algorithm to complete its job. Following the steps described above, the degree of the polynomial is 1, because 3n = 3n¹ contains the only (and therefore highest) power of n. That leaves us with 3n, and dropping the coefficient leaves n. So the time complexity of the sum algorithm, stripped of constants and non-dominant terms, is n, which we denote in Big O notation as O(n). This is called linear time complexity, because the number of steps increases linearly as the input grows.

As another example, suppose we created an algorithm, calculated how many steps it takes to execute, and got f(n) = 4n⁴ + 4n² + n + 4. The highest-degree term of this polynomial is 4n⁴. Removing the coefficient, we can see that the time complexity of the algorithm is n⁴, which is usually presented as O(n⁴).

Note that, as discussed in another article, Asymptotic/Complexity Analysis, we ignore details such as constants and non-dominant terms when analyzing algorithms and calculating time complexity.

A recursive example

Function to find the factorial of a number

Here we have a recursive function to calculate the factorial of a given number.
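The pictured function isn't reproduced here, but a recursive factorial sketched in Python might look like this (names assumed):

```python
def factorial(n):
    # Base case: 0! and 1! are both 1.
    if n <= 1:
        return 1
    # Recursive case: n! = n * (n - 1)!
    return n * factorial(n - 1)

print(factorial(3))  # 6
```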

Here, factorial is going to be called n times recursively, and thus the time complexity is O(n).

To understand it, imagine we’re calling factorial with the number 3. So factorial is called for the first time with 3. Then, when it reaches the return statement, factorial calls itself with the number 2. Then, when it reaches the return statement again, factorial calls itself with 1, reaching the base case, allowing all previous calls to return a value. Thus, factorial is always called n times and its time complexity is O(n).

Why should you care about time complexity?

Understanding time complexity is essential for any programmer to optimize the algorithms they build. By analyzing algorithms and determining their time complexity, you can come up with optimizations that improve the performance, scalability, speed, and overall efficiency of the program.

Keep in mind that there may be trade-offs between time and space complexity, and which one you’ll prioritize depends on your situation. For example, in scenarios with limited memory resources, such as embedded systems or mobile applications, you might prefer algorithms that use less memory, even though they may take more time to execute.

Donate

Did you like this article? If you did, please consider buying me a coffee.


Rafael

Hey, there! I'm Rafael Maia, a self-taught front-end developer sharing the knowledge that I acquire along my journey with you in the simplest way I can.