# Complexity and Big-O Notation In Swift

## Time Complexity

# Introduction

In this post, I will touch on **complexity** and **Big O Notation**. In programming, the term complexity doesn’t mean just one thing; there are different types of computational complexity. The two most common are time complexity, the number of operations an algorithm performs relative to its input size, and space complexity, the amount of storage it uses relative to its input size. For simplicity’s sake, this post focuses on time complexity.

**Asymptotic Analysis**

> “*Asymptotic analysis is the process of describing the efficiency of algorithms as their input size (n) grows.*”
> — *Swift Algorithms and Data Structures*

In programming, asymptotic analysis is generally described using **Big O Notation**.

**Worst-case scenario**

When referring to the worst-case scenario, we mean an input that causes maximum complexity. The difference between the best and worst case can be significant: searching a hash table, for example, can go from constant **O(1)** time in the best case to linear **O(n)** time in the worst. If you search for an element in an array by iterating over each item and that element is the first item, you get a very good runtime. It’s another story if that item is the last element in the array. Generally speaking, you shouldn’t assume every input will give you the optimal time.

**O(1) — Constant Time**

An **O(1)** operation’s complexity is constant regardless of the number of inputs. Accessing the first element in an array will always be O(1); it is the same for an array with 1,000 elements as for one with 10. In fact, accessing any element in an array by its index is a constant-time operation, whether it is the first item or the 1,000th.

**Examples of O(1) operations:**

* *Looking up an element in a hash table*
* *Adding a node to the head of a linked list*
* *Accessing an element in an array by index*

**Accessing an element in an array by index**:
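A minimal sketch (the array `numbers` here is just illustrative): subscripting computes the element’s offset directly from the index, so the cost does not depend on how many elements the array holds.

```swift
// Accessing an element by index is O(1): the offset is computed
// directly from the index, so the array's size is irrelevant.
let numbers = [10, 20, 30, 40, 50]

let first = numbers[0]   // O(1)
let third = numbers[2]   // O(1), same cost as any other index
print(first, third)
```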

**Adding a node to a linked list**:
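One way to sketch this is with a hand-rolled singly linked list (the `Node` and `LinkedList` types below are illustrative, not from the standard library). Pushing a value onto the head only rewires two references, so it is O(1) no matter how long the list already is. Appending to the tail without a stored tail reference would instead require an O(n) traversal.

```swift
// A minimal singly linked list. Inserting at the head is O(1)
// because no traversal is needed, only reference updates.
final class Node<T> {
    var value: T
    var next: Node?
    init(value: T) { self.value = value }
}

final class LinkedList<T> {
    private(set) var head: Node<T>?

    // O(1): the new node simply points at the old head.
    func push(_ value: T) {
        let node = Node(value: value)
        node.next = head
        head = node
    }
}
```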

# O(log n) — Logarithmic Time

When dealing with an O(log n) operation, each iteration reduces the remaining work by a constant factor rather than by a fixed amount. A prime example of this is binary search: each iteration halves the interval within which it searches.

**Examples of O(log n) operations:**

*Binary search*

**Binary search in Swift**:
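A sketch of an iterative binary search over a sorted array: each pass halves the search interval, so the number of comparisons grows with log n rather than n.

```swift
// Iterative binary search over a sorted array. Each pass halves
// the interval, giving O(log n) comparisons.
func binarySearch<T: Comparable>(_ array: [T], for target: T) -> Int? {
    var low = 0
    var high = array.count - 1
    while low <= high {
        let mid = low + (high - low) / 2
        if array[mid] == target {
            return mid                 // found the target
        } else if array[mid] < target {
            low = mid + 1              // discard the lower half
        } else {
            high = mid - 1             // discard the upper half
        }
    }
    return nil                         // target is not in the array
}

binarySearch([1, 3, 5, 7, 9, 11], for: 7)   // 3
binarySearch([1, 3, 5, 7, 9, 11], for: 4)   // nil
```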

# O(n) — Linear Time

An O(n) operation’s complexity scales linearly with the number of inputs. Iterating through all the elements in an array is O(n), because the work depends on the number of items in your collection. Going through 10 elements in an array requires ten cycles. If you multiply the array count by 10, the cycles increase to 100, ten times as many.

**Examples of O(n) operations:**

* *Traversing an array or linked list*
* *Removing a node from a specific index of a linked list*

**Remove linked list node at specified index**:
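A sketch assuming the same kind of hand-rolled singly linked list as above (the `ListNode` type is illustrative). The unlinking itself is cheap; it is the traversal to reach the node that makes the operation O(n).

```swift
// Removing a node at a given index requires walking the list to
// that position first, which makes the operation O(n) overall.
final class ListNode {
    var value: Int
    var next: ListNode?
    init(_ value: Int) { self.value = value }
}

func removeNode(at index: Int, from head: ListNode?) -> ListNode? {
    guard let head = head else { return nil }
    if index == 0 { return head.next }    // removing the head is O(1)
    var current: ListNode? = head
    // O(n): traverse to the node just before the one being removed.
    for _ in 0..<(index - 1) {
        current = current?.next
    }
    current?.next = current?.next?.next   // unlink the target node
    return head
}
```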

**Traversing an array**:
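A minimal sketch: one pass over the array, so the number of iterations equals the element count.

```swift
// Visiting every element once is O(n): the work grows in direct
// proportion to the array's count.
let values = [2, 4, 6, 8, 10]
var sum = 0
for value in values {   // n iterations for n elements
    sum += value
}
print(sum)
```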

# O(n²) — Quadratic Time

An O(n²) operation’s complexity scales quadratically, with the square of the number of inputs. A simple example of an O(n²) process is a loop nested within a loop. If you took an array with six elements and, for each element, accessed every element in the range 0..<array.count, you would access your array 36 times. Your complexity is not scaling directly with the input, but with the input squared. A worst-case scenario for a bubble sort is O(n²).

**Examples of O(n²) operations**:

*Traversing a 2D array*

**Traversing 2D Array in Swift**:
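A sketch with an illustrative 3 × 3 grid: the nested loops visit every cell exactly once, so an n × n grid costs n² visits.

```swift
// A nested loop over an n × n grid performs n * n visits: O(n²).
let grid = [
    [1, 2, 3],
    [4, 5, 6],
    [7, 8, 9]
]

var visits = 0
for row in grid {        // n rows
    for value in row {   // n columns per row
        visits += 1
        _ = value
    }
}
print(visits)
```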
