# Big O Time Complexity 101 for Swift Developers

When I was less experienced as a software engineer, I had the misconception that learning data structures and algorithms (DSA) was not necessary for my career. In my arrogance, I even declared that any employer who required knowledge of DSA would not be worth interviewing for.

However, as I gained more experience in the field, I realized how wrong I was. DSA is a fundamental concept in computer science and plays a crucial role in understanding how to design and optimize software. It’s not only necessary for passing technical interviews but also for writing efficient and scalable code.

I’ve come to understand that DSA is essential for any serious software engineer. It’s not just a tool for passing interviews; it helps you design and implement better software, ultimately benefiting your employer, the end user, and you. I’m now making an effort to learn and improve in this area. These are my learnings so far; I hope you’ll gain knowledge or inspiration from them. By the end of the post, you should have a fundamental understanding of Big O time complexity and be able to apply these concepts in your projects.

# The 3 rules of Big O

- Growth is with respect to the input
- Constants are dropped
- The worst case is usually the way we measure
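As a quick illustration of the first two rules, here is a hypothetical function that walks the same array twice. Technically it does 2n operations, but growth is measured with respect to the input and constants are dropped, so it is still O(n):

```swift
// Hypothetical example: sumAndMax walks the array twice (2n operations),
// but its time complexity is still O(n) because constants are dropped.
func sumAndMax(_ numbers: [Int]) -> (sum: Int, max: Int?) {
    var sum = 0
    for n in numbers { // first pass: n operations
        sum += n
    }
    var maxValue: Int? = nil
    for n in numbers { // second pass: another n operations
        if maxValue == nil || n > maxValue! {
            maxValue = n
        }
    }
    return (sum, maxValue)
}

print(sumAndMax([3, 1, 4])) // Output: (sum: 8, max: Optional(4))
```

Doubling the input doubles the work, regardless of how many constant-factor passes the function makes; that proportionality is what O(n) captures.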

# Warm-up

Before diving into data structures and algorithms, it’s important to understand the time complexity of some basic array methods in Swift. Time complexity describes how the running time of an algorithm grows in relation to the input size.

Here are some common array methods in Swift and their time complexity:

- `append(_:)`: O(1) amortized. Appending to the end of an array is a constant-time operation on average; when the array’s storage has to be reallocated a single append costs O(n), but averaged over many appends the cost is constant.
- `insert(_:at:)`: O(n). Inserting an element at a specific index requires shifting all elements after that index, so the time complexity increases linearly with the size of the array.
- `remove(at:)`: O(n). Similar to inserting, removing an element at a specific index requires shifting all elements after that index.
- `firstIndex(of:)`: O(n). Finding the index of a specific element requires scanning the array, so in the worst case every element is visited. (The older `index(of:)` spelling was renamed to `firstIndex(of:)` in Swift 4.2.)
- `sorted()`: O(n log n), where n is the size of the array. Swift’s standard library sort is an adaptive hybrid algorithm (a modified Timsort as of Swift 5) rather than plain quicksort.
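To make those numbers concrete, here is a small sketch that exercises each method, with the complexity of each call noted in the comments:

```swift
var numbers = [4, 2, 5]

numbers.append(9)          // O(1) amortized: [4, 2, 5, 9]
numbers.insert(1, at: 0)   // O(n): shifts four elements right -> [1, 4, 2, 5, 9]
numbers.remove(at: 2)      // O(n): shifts the tail left -> [1, 4, 5, 9]
let position = numbers.firstIndex(of: 5) // O(n): linear scan
let ordered = numbers.sorted()           // O(n log n)

print(numbers)            // Output: [1, 4, 5, 9]
print(position ?? -1)     // Output: 2
print(ordered)            // Output: [1, 4, 5, 9]
```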

# Are you warmed up? Main course time.

This section will introduce common algorithms and how they can be applied in Swift. I will provide code samples in Swift to demonstrate how they work and their time complexity. Specifically, I’ll cover the following topics:

- First, **linear search**, a simple algorithm for finding a specific element in an array.
- Next, **binary search**, an efficient algorithm for searching through a sorted array.
- Finally, a basic sorting algorithm, **bubble sort**.

I will compare and contrast the different algorithms, highlighting their strengths and weaknesses and when to use each one.

**Linear search**

**Linear search**, also known as sequential search, is a simple algorithm for finding a specific element in an array. The algorithm starts at the beginning of the array and compares each element to the target element until it is found or the end of the array is reached.

Here is a code sample in Swift that demonstrates how to perform a linear search:

```swift
func linearSearch(array: [Int], target: Int) -> Int? {
    for (index, element) in array.enumerated() {
        if element == target {
            return index
        }
    }
    return nil
}

let array = [1, 2, 3, 4, 5]
let target = 3
let index = linearSearch(array: array, target: target)
print(index) // Output: Optional(2)
```

In this example, the linearSearch function takes in an array of integers and a target integer as its parameters. The function uses a for loop to iterate through the array and compares each element to the target. If a match is found, the index of the element is returned. If the end of the array is reached without finding a match, the function returns nil.

The time complexity of linear search is **O(n)**, where n is the size of the array. This means that as the size of the array increases, the number of operations required to find the target element also increases linearly. While linear search is a simple and straightforward algorithm, it can become inefficient for large arrays. More efficient algorithms, such as binary search, should be considered in such cases.
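In practice you rarely hand-roll linear search in Swift; the standard library’s `firstIndex(of:)` and `firstIndex(where:)` perform the same O(n) scan. Still, a generic version of the function above is a small change and works for any `Equatable` element type, not just `Int`:

```swift
// A generic linear search; Equatable is all we need to compare elements.
func linearSearch<T: Equatable>(array: [T], target: T) -> Int? {
    for (index, element) in array.enumerated() where element == target {
        return index
    }
    return nil
}

print(linearSearch(array: ["a", "b", "c"], target: "b") ?? -1) // Output: 1
```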

**Binary search**

**Binary search** is an efficient algorithm for searching through a sorted array. The algorithm repeatedly divides the array in half, eliminating half of the remaining elements with each comparison, until the target element is found or it is determined that the element is not present in the array.

Here is a code sample in Swift that demonstrates how to perform a binary search:

```swift
func binarySearch(array: [Int], target: Int) -> Int? {
    var left = 0
    var right = array.count - 1
    while left <= right {
        // Written as left + (right - left) / 2 rather than (left + right) / 2
        // to avoid integer overflow on very large arrays.
        let middle = left + (right - left) / 2
        let guess = array[middle]
        if guess == target {
            return middle
        }
        if guess > target {
            right = middle - 1
        } else {
            left = middle + 1
        }
    }
    return nil
}

let array = [1, 2, 3, 4, 5]
let target = 3
let index = binarySearch(array: array, target: target)
print(index) // Output: Optional(2)
```

In this example, the binarySearch function takes in an array of integers and a target integer as its parameters. The function uses a while loop to repeatedly divide the array in half, eliminating half of the remaining elements with each comparison. The variables `left` and `right` keep track of the range in which the target can be found. In each iteration, the middle index of the range is calculated, and the element at that index is compared to the target. If a match is found, the index of the element is returned. If the target is greater than the element at the middle index, the left bound is updated to one element greater than the middle index. If the target is less than the element at the middle index, the right bound is updated to one element less than the middle index. If the left bound exceeds the right bound, the function returns nil, indicating that the target element is not present in the array.

The time complexity of binary search is **O(log n)**, where n is the size of the array. This means that as the size of the array increases, the number of operations required to find the target element increases logarithmically. This makes binary search a much more efficient algorithm than linear search for large arrays. However, it is important to note that **the array must be sorted** for a proper binary search. The array must be sorted because the algorithm repeatedly divides the array in half and uses the middle element as a pivot point to determine whether the target element is in the left or right half of the array. If the array is not sorted, the pivot point will not accurately represent the elements in the left or right half, and the algorithm will not be able to eliminate half of the remaining elements with each comparison, making it inefficient and less accurate.
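To see why the precondition matters, run the same function against an unsorted array. Searching `[3, 1, 2]` for 3 compares against the middle element 1, wrongly concludes the target must be in the right half, and never looks at index 0. (The function is repeated here so the snippet stands alone.)

```swift
func binarySearch(array: [Int], target: Int) -> Int? {
    var left = 0
    var right = array.count - 1
    while left <= right {
        let middle = left + (right - left) / 2
        let guess = array[middle]
        if guess == target { return middle }
        if guess > target { right = middle - 1 } else { left = middle + 1 }
    }
    return nil
}

let unsorted = [3, 1, 2]
// 3 is present at index 0, but binary search misses it:
// middle = 1, guess = 1 < 3, so the left half is discarded.
print(binarySearch(array: unsorted, target: 3)) // Output: nil
```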

**Bubble sort**

**Bubble sort** is a basic sorting algorithm that repeatedly steps through the array, compares adjacent elements, and swaps them if they are in the wrong order. The pass through the array is repeated until the array is sorted.

Here is a code sample in Swift that demonstrates how to perform a bubble sort:

```swift
func bubbleSort(array: inout [Int]) {
    for i in 0..<array.count {
        for j in 1..<array.count - i {
            if array[j] < array[j - 1] {
                swap(array: &array, first: j - 1, second: j)
            }
        }
    }
}

func swap(array: inout [Int], first: Int, second: Int) {
    let temp = array[first]
    array[first] = array[second]
    array[second] = temp
}

var array = [5, 3, 2, 4, 1]
bubbleSort(array: &array)
print(array) // Output: [1, 2, 3, 4, 5]
```

In this example, the `swap` function is a helper function that takes in an array, an index of the first element, and an index of the second element, and it swaps the elements at those indexes. The `bubbleSort` function takes in an array of integers as its parameter and sorts the array in *ascending* order. The function uses nested for loops to step through the array repeatedly. The outer loop iterates through the array and the inner loop compares adjacent elements. If they are in the wrong order, it calls the `swap` function, passing the array and the indexes of the elements to be swapped. The variable `i` keeps track of how many elements have been sorted and the variable `j` iterates through the unsorted elements.

The time complexity of bubble sort is **O(n²)**, where *n* is the size of the array. This means that as the size of the array increases, the number of operations required to sort the array also increases quadratically. Bubble sort is easy to understand and implement, but it’s inefficient for large arrays and not used in practice for large datasets.

Bubble sort can be improved by tracking whether a pass made any swaps; if a full pass makes none, the array is already sorted and the function can return early. This is called optimized bubble sort, and it reduces the time complexity to **O(n)** in the best-case scenario where the array is already sorted.
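A sketch of that optimization: a `swapped` flag records whether the inner loop did any work, and a clean pass ends the sort immediately. This version uses the standard library’s `swapAt(_:_:)` instead of a custom helper:

```swift
func optimizedBubbleSort(array: inout [Int]) {
    for i in 0..<array.count {
        var swapped = false
        for j in 1..<array.count - i {
            if array[j] < array[j - 1] {
                array.swapAt(j - 1, j) // standard-library swap
                swapped = true
            }
        }
        // No swaps in a full pass means the array is already sorted.
        if !swapped {
            return
        }
    }
}

var alreadySorted = [1, 2, 3, 4, 5]
optimizedBubbleSort(array: &alreadySorted) // finishes after a single O(n) pass

var messy = [5, 3, 2, 4, 1]
optimizedBubbleSort(array: &messy)
print(messy) // Output: [1, 2, 3, 4, 5]
```

The worst case is still O(n²), so the big-picture verdict on bubble sort does not change; the optimization only helps on inputs that are already sorted or nearly sorted.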

# Conclusion

Understanding DSA is crucial to becoming a proficient software engineer. In this post, I’ve covered three common algorithms: linear search, binary search, and bubble sort, and discussed their time complexity and when to use them. Remember that these are just a small sample of the many algorithms and data structures available; *there’s always more to learn*. By understanding the time complexity and characteristics of different algorithms and data structures, you will be better equipped to make informed decisions when designing and optimizing your code.

Thanks for reading!

Special mentions go to ThePrimeagen and his free DSA course on Frontend Masters and my colleagues at Ōura who keep me motivated to become better.