Big O Notation Basics

I overheard a bunch of my classmates talking about a book called Cracking the Coding Interview. All of them said they either had the book or had read it before, so later that night I ordered my copy from Amazon Prime.

The first or second chapter of the book is all about Big O notation. I’d never heard of Big O before and understood absolutely none of it. I even checked to make sure I’d purchased the correct book and that the language it was written in was, in fact, English. It was.

Thinking about algorithms hurts my brain. For the less mathematically inclined, this is a difficult concept to understand. But I’m going to need to make friends with them if I want to be a developer. My goal here is to find a working definition of Big O that makes sense to me.

I’ve seen Big O described as the ‘asymptotic upper limit of a function.’ An asymptote is a line that a curve approaches, closer and closer without ever touching, as it goes on and on towards infinity.

In programming, Big O is used to classify algorithms by their runtime efficiency and space requirements as the input size grows. In plain English, Big O describes the limiting behavior of a function as its argument tends towards infinity…or, put simply, what will happen to your program’s efficiency as inputs get larger and larger.
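To make that concrete, here’s a small sketch of my own (these function names are just made up for illustration): two ways of touching a list that grow very differently as the list gets bigger.

```javascript
// Constant time, O(1): one operation no matter how long the list is.
const getFirst = (list) => list[0];

// Linear time, O(n): the work grows in step with the input size,
// because every element gets touched exactly once.
const sumAll = (list) => {
  let total = 0;
  for (const item of list) {
    total += item;
  }
  return total;
};
```

Hand either function a list of ten items or ten million: `getFirst` does the same amount of work, while `sumAll` does a million times more.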

The other day, I was on the phone with a Senior Software Engineer from Google — he was describing the interview process and why it relies so heavily on algorithmic questions. His analogy (inspired by the view from his porch) was a program to count all the deer in his yard. He described a simple program with a counter. But what if we wanted to extend this program to the neighbors so they too could count the deer in their yards? Would the program still work? And if everyone in California were to use this program, would it work then? How about all of the United States, or the entire Northern Hemisphere? All of planet Earth (I mean, we are talking about Google here)? It just keeps going, on and on, until someone says ‘uncle’ or time runs out.
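He didn’t share any actual code, so this is just my guess at what the deer counter might look like — a single pass over a list of sightings, which is O(n):

```javascript
// Hypothetical deer counter: one pass over the sightings, so O(n).
// Scaling from one yard to every yard on Earth doesn't change the
// *shape* of the work — it just makes n enormous. That's the question
// Big O answers: does the running time stay proportional to n, or blow up?
const countDeer = (sightings) =>
  sightings.filter((animal) => animal === 'deer').length;
```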

Everyone’s seen a graph like this, depicting Big O complexity — the number of operations needed to solve a problem. It compares the runtimes of various algorithms. The ‘O’ in O(n) stands for the order of the function, and it reads as ‘O of n.’

In base 10, an increase of one order of magnitude is the same as multiplying a quantity by 10. An increase of two orders of magnitude is the equivalent of multiplying by 100, or 10^2. In general, an increase of n orders of magnitude is the equivalent of multiplying a quantity by 10^n.
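That rule is easy to check in code (a throwaway helper I made up for this):

```javascript
// Going up n orders of magnitude in base 10 means multiplying by 10^n.
const raiseByOrders = (quantity, n) => quantity * 10 ** n;
```

So `raiseByOrders(3, 2)` gives 300 — three raised by two orders of magnitude.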

So, the graph shows the time it would take to run a function with different sized inputs.

| Big O | Name | Example |
| --- | --- | --- |
| 1 | Constant | `const last = (list) => list[list.length - 1]` |
| log(n) | Logarithmic | Binary search |
| n | Linear | `for (let i = 0; i < n; i++) { console.log(i) }` |
| n log(n) | Linearithmic | Heap/Merge/Quick Sort |
| n^2 | Quadratic | Nested loop |
| n^3 | Cubic | Triple nested loop |
| 2^n | Exponential | Every combination or permutation |
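Binary search is the classic O(log n) entry in that table, and it’s short enough to sketch here. Each comparison throws away half of the remaining range, so a million-element array takes only about 20 steps:

```javascript
// Binary search over a sorted array: O(log n).
// Each loop iteration halves the range [lo, hi] still under consideration.
const binarySearch = (sorted, target) => {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found it
    if (sorted[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                      // discard the right half
  }
  return -1; // not found
};
```

Compare that to scanning the whole array one element at a time, which is the O(n) row — for big inputs, halving beats stepping every time.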