# From Big NO to Big O

Becoming familiar with the concept of Big O notation as a bootcamp grad.

Having no CS background, and with integrated algebra as the highest level of math I'd taken, algorithms were a subject I was dreading. During my time at the Flatiron School, I thoroughly enjoyed DOM manipulation and was excited whenever anything frontend related came up, but there was always the looming world of algos that I knew I would eventually have to tackle. My fear came from not being able to grasp the concept of Big O and getting lost in the logical process of algos. Don't even get me started on writing an algorithm on the spot for technical interviews. This week I've decided to step out of my comfort zone (frontend/design) and finally get acquainted with the world of algorithms.

# Defining Big O Notation

Picture this: you're at home and you suddenly have a craving for pie. You bust out your nan's secret key lime pie recipe and it's heaven. You decide to bake your nan's pie for your friends, but suddenly someone else says that THEIR pie is better than yours. War breaks out, because there are too many pies, each claiming to be better than the others. In your efforts to prove that your nan's recipe is the best one, you've forgotten that a key lime pie may not always be best for every occasion and that there are many different types of pies out there. To make peace, an annual pie competition is created where pies are judged across all sorts of different categories and occasions. You even learn new pie recipes to try out!

Now back to the present. If you replace every mention of "pie" with the word "algorithm" in the above analogy, you now understand the need for Big O Notation. Algorithms help solve problems through step-by-step procedures. An algo might behave differently depending on the amount of data it needs to process. Sure, your first full stack application runs smoothly with only 10 users, but what happens when 100k users are on at the same time? Big O Notation is a way to evaluate an algorithm's performance (time or space/memory) as its input increases. Some algos may be blue ribbon worthy, while others might be an honorable mention depending on what is needed.

By creating a graph with one axis tracking the amount of data input (X axis) and the other measuring how much time or space an algo takes (Y axis), we begin to see certain patterns as points are plotted. Connecting the points, we begin to see trends that help identify an algo's performance. The trend an algo follows is referred to as its time or space complexity. This is where all those O(n) or O(1) (read as "O of ___") come into play; they are some of the trends that algos can create. We need Big O in order to see which solution or algo is best for the problem at hand.
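To make those trends a little more concrete, here's a minimal sketch of an O(1) algo next to an O(n) one. The function names and the guest-list setup are my own invention for illustration, not from any library:

```javascript
// O(1): constant time. Grabbing the first guest takes the same
// single step whether 10 or 100k guests show up.
function firstGuest(guests) {
  return guests[0];
}

// O(n): linear time. Greeting every guest means one step per
// guest, so the work grows in step with the input size.
function greetAll(guests) {
  const greetings = [];
  for (const guest of guests) {
    greetings.push(`Hi, ${guest}!`);
  }
  return greetings;
}
```

No matter how long the guest list gets, `firstGuest` does the same amount of work, while `greetAll`'s work keeps climbing with it. That difference in trend is exactly what the graph above shows.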

## Time Complexity 🕰

Personally I find time complexity a bit easier to grasp, so let's start there. An algo's input size is referred to as `n` when talking about what sort of trend it follows. So in regards to time, as `n` increases, we're looking to see what sort of trend an algo makes. Time complexity measures an algo's runtime as the amount of data increases. Now here's a small tricky bit. When I say runtime, I don't mean the actual number of milliseconds it takes to run the code. That time can vary significantly according to different factors. We want something more reliable, so time complexity instead counts the number of operations (addition, subtraction, etc.) within the algorithm. My current Mac could be considerably slow in comparison to others, which would affect the time my code takes if I'm counting exact milliseconds, but counting operations serves as a better comparison across different computers.
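Here's a rough sketch of what "counting operations" looks like in practice (the helper names are mine, just for illustration):

```javascript
// Counting operations instead of milliseconds: the loop body
// runs once per element, so the operation count grows with n.
// The trend is O(n), on any computer.
function sumAll(numbers) {
  let total = 0;          // 1 operation
  for (const num of numbers) {
    total += num;         // n operations, one per element
  }
  return total;
}

// Same result via a closed-form formula: a handful of operations
// no matter how big n is, so the trend is O(1).
function sumUpTo(n) {
  return (n * (n + 1)) / 2;
}
```

Both approaches return the same answer for summing 1 through n, but the first one's operation count climbs with the input while the second's stays flat, and that trend is what Big O captures.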

## Space Complexity 🛰

Not only do algorithms vary in time complexity as the amount of input increases, but the space needed to run them can grow too. Going a little deeper, the space taken up by an algo **by itself** is called "auxiliary space complexity", because guess what: the **input** also takes up space. For the purpose of this explanation, let's think about space complexity in reference to the algorithm itself and the space within it. So what exactly do we mean by "space"? Time complexity was about operations, but space complexity deals with the data values created inside the algo (i.e. booleans, arrays, null, etc.). The amount of space needed for each value differs, and the result is an overall space complexity that follows the same trends we saw in time complexity. I'm still getting the hang of this one, because full space complexity also includes the space the input takes, but for now it's good to start with the memory needed just within an algorithm.
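A small sketch of what auxiliary space looks like (again, these function names are just my own examples):

```javascript
// O(1) auxiliary space: only a couple of variables are created,
// no matter how large the input array is.
function biggest(numbers) {
  let best = -Infinity;
  for (const num of numbers) {
    if (num > best) best = num;
  }
  return best;
}

// O(n) auxiliary space: a brand-new array roughly the size of
// the input is built, so the memory used grows with n.
function doubleAll(numbers) {
  const doubled = [];
  for (const num of numbers) {
    doubled.push(num * 2);
  }
  return doubled;
}
```

Both functions loop over the whole input, so their time complexity is the same, but `doubleAll` allocates a whole new array while `biggest` only ever holds one extra number. Space and time complexity can tell different stories about the same algo.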

# Why Use It?

As of today, I'm still a little apprehensive about algorithms and Big O Notation, but I have a new appreciation for them. Keeping Big O in mind can make the difference between a slow government website experience and a website that keeps users coming back because of how reliable it is. So look out for those Udemy algorithm sales, collect your resources, buy a small whiteboard, and bust out your nan's secret pie recipe because Big O is here to stay.