Big O Notation: Intro & Overview

Sadia Badhon
4 min readOct 5, 2020

In many programming languages there are often multiple ways of completing the same task, and this is especially true of JavaScript. So, in order to compare multiple implementations of the same function, or different pieces of code that complete the same task, we use Big O notation.

The chart above won't make much sense just yet; however, my hope is that by the end of this article you'll have enough of an understanding of Big O to understand the general trends it displays.

Big O is also useful for talking about tradeoffs between approaches, as well as for finding weak points in your code. Let's not forget interviews! Almost always, especially if you have time left over, you'll be asked to talk about the efficiency of your code, and that's when you can let your Big O knowledge shine.

Overall, Big O isn't something we can avoid in programming, but it seems to be one of the more daunting topics, so let's jump right into it!

We use Big O notation to discuss the time and space complexity of our code. Time complexity is how we analyze the runtime of an algorithm as the size of its inputs increases. Space complexity is how much additional memory is needed to run the code in our algorithm. With space complexity it is important to note that we are referring to the auxiliary space complexity: the space required by the algorithm itself, NOT including the space taken up by the inputs.

I will also go over logarithms, as some algorithms involve logs, such as sorting, searching, and sometimes recursion. I won't go too deep into the math, but this is something we have to keep in mind. Because Big O and optimization is a fairly large topic, this is more of a Part 1 in an article series.

How do we calculate the Big O of time complexity?

Simply put, we count the number of simple operations the computer has to perform. This makes more sense than timing the code itself with built-in timers, because different computers differ in their processing speeds, so wall-clock time isn't a reliable measure of the code's efficiency. The count of simple operations, on the other hand, stays the same no matter what computer we're working on.

Here are a few examples of runtimes (there are many more, but these three are the most common):

  1. Constant: f(n) = 1 → O(1)
  2. Linear: f(n) = n → O(n)
  3. Quadratic: f(n) = n² → O(n²)
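To make these three runtimes concrete, here is a small JavaScript sketch (the function names are my own, just for illustration):

```javascript
// O(1) – constant: one operation, regardless of how big the array is
function getFirst(arr) {
  return arr[0];
}

// O(n) – linear: the loop runs once per element
function sum(arr) {
  let total = 0;
  for (let i = 0; i < arr.length; i++) {
    total += arr[i];
  }
  return total;
}

// O(n²) – quadratic: a loop nested inside a loop,
// so the work grows with the square of the input size
function allPairs(arr) {
  const pairs = [];
  for (let i = 0; i < arr.length; i++) {
    for (let j = 0; j < arr.length; j++) {
      pairs.push([arr[i], arr[j]]);
    }
  }
  return pairs;
}
```

Notice that doubling the input doubles the work for `sum`, but quadruples it for `allPairs`.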

How do we calculate the number of simple operations?

  1. Constant-time operations don't change the growth rate; each one counts as a single simple operation regardless of input size. Examples:
    a. Arithmetic operations ( +, -, *, / )
    b. Variable assignment ( let, const, var )
    c. Accessing elements in an array by index or in an object by key
  2. Loops are more complex: the total complexity is the length of the loop times the complexity of whatever is inside the loop
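The loop rule above can be sketched with a small operation counter in JavaScript (a hypothetical helper I'm writing just to illustrate the counting):

```javascript
// Counts "simple operations" to illustrate the rule:
// loop length × work per iteration = total operations
function countOps(n) {
  let ops = 0;

  // A single loop: n iterations of constant work → n operations → O(n)
  for (let i = 0; i < n; i++) {
    ops++;
  }

  // A nested loop: n iterations, each doing n more steps
  // → n * n = n² operations → O(n²)
  for (let i = 0; i < n; i++) {
    for (let j = 0; j < n; j++) {
      ops++;
    }
  }

  return ops; // n + n²; as n grows, the n² term dominates, so this is O(n²)
}
```

For n = 10 this returns 110 (10 + 100); the quadratic term quickly swamps the linear one, which is why we keep only the dominant term in Big O.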

How do we calculate the Big O of space complexity?

We can calculate the auxiliary space complexity with a few rules of thumb:

  1. Most primitive data types (booleans, numbers, null, undefined) take up constant space.
  2. Strings require O(n) space, where n is the length of the string.
  3. Reference types are also generally O(n), where n is the length of the array or the number of keys in the object.

Logarithms

A logarithm is the inverse of exponentiation. There are many bases you can use; a common one is the binary logarithm, which is log base 2. When writing Big O notation that involves logs, we tend to drop the base, even though when writing regular log expressions we must include it.

The logarithm (base 2) of a number roughly measures the number of times you can divide that number by 2 before you get a value that's less than or equal to one (≤ 1).
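That "keep halving until you reach one" idea translates directly into code. Here's a small sketch (a hypothetical helper of my own) showing that repeated halving matches `Math.log2` for powers of two:

```javascript
// Counts how many times n can be divided by 2
// before the result is ≤ 1 – roughly log₂(n)
function countHalvings(n) {
  let count = 0;
  while (n > 1) {
    n = n / 2;
    count++;
  }
  return count;
}
```

For example, 8 → 4 → 2 → 1 takes three halvings, and indeed `Math.log2(8)` is 3. This halving pattern is exactly why algorithms like binary search, which cut the problem in half at every step, run in O(log n) time.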
