Short Introduction to Multithreading and Thread-Safe Operations in Swift

Dimas Wisodewo
9 min read · Jun 9, 2023


Every application has at least one thread, the main thread, that can delegate tasks to other threads. This delegation is known as multithreading, where tasks are performed concurrently on multiple threads to enhance the application’s performance.

Parallelism and Concurrency

In a single-core processor, concurrency is achieved through context switching, where the processor rapidly switches between tasks, allowing multiple tasks to make progress concurrently by allocating small time slices to each task. This efficient utilization of the processor’s time enables the illusion of simultaneous execution.

On the other hand, parallelism is achievable in a multi-core processor where there are multiple physical processing units (cores) available. Each core can independently execute tasks simultaneously, without the need for context switching. In a parallel system, tasks can be executed in parallel on different cores, allowing for true simultaneous execution.

Multithreading in Swift

Multithreading in Swift on iOS allows you to perform tasks concurrently, improve performance, and keep user interfaces responsive. Several techniques and APIs are available for multithreading in Swift. Here are a few commonly used approaches:

Grand Central Dispatch (GCD)

GCD is a powerful multithreading technology provided by Apple. It abstracts the complexities of thread management and provides a high-level API for concurrent programming.

  • Dispatch Queues: You can use dispatch queues to execute tasks concurrently. There are two types of queues: serial queues (tasks executed one at a time) and concurrent queues (tasks executed concurrently). You can create your own queues or use the global concurrent queues provided by GCD.
// Dispatch Queue

func task1() {
    print("Task 1 started")
    Thread.sleep(forTimeInterval: 1)
    print("Task 1 finished")
}

func task2() {
    print("Task 2 started")
    Thread.sleep(forTimeInterval: 1)
    print("Task 2 finished")
}

// Perform tasks on a serial queue
// On a serial queue, the second task starts only after the first task finishes
let serialQueue = DispatchQueue(label: "com.example.serialQueue")

serialQueue.async {
    task1()
}

serialQueue.async {
    task2()
}

// Perform tasks on a concurrent queue
// On a concurrent queue, both tasks can run at the same time
let concurrentQueue = DispatchQueue(label: "com.example.concurrentQueue", attributes: .concurrent)

concurrentQueue.async {
    task1()
}

concurrentQueue.async {
    task2()
}
  • Dispatch Groups: Dispatch groups allow you to group multiple tasks and wait for them to complete. You can use DispatchGroup to manage synchronization and perform actions after all tasks in the group have finished.
// Dispatch Group

func task1(completion: @escaping () -> Void) {
    print("Task 1 started")
    Thread.sleep(forTimeInterval: 1)
    print("Task 1 finished")
    completion()
}

func task2(completion: @escaping () -> Void) {
    print("Task 2 started")
    Thread.sleep(forTimeInterval: 1)
    print("Task 2 finished")
    completion()
}

let group = DispatchGroup()
let queue = DispatchQueue.global()

// Add tasks to the group and run them concurrently
group.enter()
queue.async {
    task1 {
        group.leave()
    }
}

group.enter()
queue.async {
    task2 {
        group.leave()
    }
}

// Block until every enter() is balanced by a leave()
group.wait()

// Perform an action after all tasks are done
print("All tasks are done")
  • Dispatch Semaphore: A dispatch semaphore lets you limit how many threads can access a shared resource at the same time.
// Dispatch Semaphore

let queue = DispatchQueue(label: "com.example.concurrentQueue", attributes: .concurrent)
let semaphore = DispatchSemaphore(value: 3) // At most 3 tasks may run at once

for i in 1...10 {
    queue.async {
        semaphore.wait()   // Decrement the counter; blocks when it reaches zero
        print("Task \(i) start")
        Thread.sleep(forTimeInterval: 1) // Simulate one second of work
        print("Task \(i) finish")
        semaphore.signal() // Increment the counter, letting a waiting task proceed
    }
}

In the code above, the semaphore's initial value is 3, so at most three tasks run at the same time. Task 4 starts only after one of the first three tasks signals the semaphore.

Operation Queue

Operation Queue is a higher-level abstraction built on top of GCD. It allows you to manage and execute operations concurrently or sequentially. Operations can be subclasses of Operation or closures wrapped in BlockOperation.

// Operation Queue

let task1 = BlockOperation {
    print("Task 1 started")
    Thread.sleep(forTimeInterval: 1)
    print("Task 1 finished")
}

let task2 = BlockOperation {
    print("Task 2 started")
    Thread.sleep(forTimeInterval: 1)
    print("Task 2 finished")
}

let task3 = BlockOperation {
    print("Task 3 started")
    Thread.sleep(forTimeInterval: 1)
    print("Task 3 finished")
}

// Task 1 will be executed only after Task 2 finishes
task1.addDependency(task2)

// Create an operation queue
let queue = OperationQueue()

// Tasks added to the operation queue are executed as soon as possible
queue.addOperation(task1)
queue.addOperation(task2)
queue.addOperation(task3)
queue.waitUntilAllOperationsAreFinished()

print("All tasks are done")

Operation queues provide additional features like dependencies between operations, maximum concurrent operation count, and cancellation support.
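Those features can be sketched as follows; the operation count and the limit of two concurrent operations are illustrative. An operation cancelled before the queue starts it never executes its block:

```swift
import Foundation

let queue = OperationQueue()
queue.maxConcurrentOperationCount = 2 // At most two operations run at once

var results: [String] = []
let resultsLock = NSLock()

let ops = (1...4).map { i in
    BlockOperation {
        resultsLock.lock()
        results.append("Op \(i)")
        resultsLock.unlock()
    }
}

// Cancelled before it is scheduled, so its block never runs
ops[3].cancel()

queue.addOperations(ops, waitUntilFinished: true)
print(results.sorted()) // ["Op 1", "Op 2", "Op 3"]
```

Note that the lock around `results` is still needed: with a concurrency limit above one, two blocks may append at the same time.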

DispatchQueue.main

In iOS, the main queue is a special serial queue associated with the main thread. It’s responsible for handling UI-related tasks. You should always perform UI updates and other UI-related work on the main queue to ensure thread safety and avoid UI inconsistencies.

DispatchQueue.global().async {
    // Perform work on a background thread

    DispatchQueue.main.async {
        // Update UI on the main thread
    }
}

Thread

Although GCD and Operation Queue are the preferred options for multithreading in Swift, you can also work directly with threads using the Thread class. However, managing threads directly is more complex and error-prone compared to GCD or Operation Queue.
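A minimal sketch of the Thread API (the thread name is illustrative, and the sleep-based wait is deliberately crude; real code would use a semaphore or dispatch group):

```swift
import Foundation

var didFinish = false
let finishLock = NSLock()

// Create and start a thread manually; you own its lifecycle
let worker = Thread {
    print("On main thread? \(Thread.isMainThread)") // false
    finishLock.lock()
    didFinish = true
    finishLock.unlock()
}
worker.name = "com.example.worker"
worker.start()

// Crude wait for the detached thread to finish
Thread.sleep(forTimeInterval: 0.5)

finishLock.lock()
print("Worker finished: \(didFinish)")
finishLock.unlock()
```

Everything GCD handles for you — pooling, scheduling, knowing when work is done — becomes your responsibility here, which is why Thread is rarely the right first choice.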

Problems

However, multithreaded operations can cause race conditions and deadlocks if proper synchronization mechanisms are not in place.

Race Condition

A race condition occurs when multiple threads access and modify shared data concurrently without proper synchronization. This can lead to unpredictable and inconsistent outcomes, data corruption, or crashes.

// Race Condition

import Foundation

var counter = 0

func incrementCounter() {
    for _ in 0..<100 {
        counter += 1
    }
}

// Create two concurrent queues
let queue1 = DispatchQueue(label: "queue1", attributes: .concurrent)
let queue2 = DispatchQueue(label: "queue2", attributes: .concurrent)

// Dispatch the incrementCounter function on both queues simultaneously
queue1.async {
    print("Queue1 run")
    incrementCounter()
}

queue2.async {
    print("Queue2 run")
    incrementCounter()
}

// Runs while the tasks may still be modifying counter; the value is unpredictable
print("Counter value: \(counter)")

In this code, there is a shared counter variable that is accessed and modified by two concurrent queues (queue1 and queue2). The incrementCounter function increments the counter variable by 1 multiple times.

Due to the concurrent execution of the queues without proper synchronization, a race condition occurs. As a result, the final value of the counter variable may not be what is expected. Running this code multiple times may yield different results each time, demonstrating the unpredictable nature of race conditions.

To resolve the race condition and ensure consistent results, you can introduce synchronization mechanisms such as locks or serial queues to ensure exclusive access to the counter variable.

// Race Condition (Fixed)

import Foundation

var counter = 0

// A serial queue guarantees that only one increment runs at a time
let syncQueue = DispatchQueue(label: "com.example.syncQueue")

func incrementCounter() {
    for _ in 0..<100 {
        syncQueue.sync {
            counter += 1
        }
    }
}

// Create two concurrent queues
let queue1 = DispatchQueue(label: "queue1", attributes: .concurrent)
let queue2 = DispatchQueue(label: "queue2", attributes: .concurrent)

let group = DispatchGroup()

// Dispatch the incrementCounter function on both queues simultaneously
queue1.async(group: group) {
    print("Queue1 run")
    incrementCounter()
}

queue2.async(group: group) {
    print("Queue2 run")
    incrementCounter()
}

// Wait until both tasks have finished before reading the counter
group.wait()

print("Counter value: \(counter)") // Always 200

Here the serial syncQueue makes each increment exclusive, and the dispatch group ensures the final value is read only after both tasks have finished.

Deadlock

A deadlock happens when threads block while waiting for each other to release resources, bringing the program to a halt. Deadlocks arise from circular dependencies or improper synchronization.

// Deadlock

import Foundation

let lockA = NSLock()
let lockB = NSLock()

func performTask1() {
    lockA.lock()
    print("Task 1 acquired lock A")

    // Simulate some processing time
    Thread.sleep(forTimeInterval: 1)

    lockB.lock()
    print("Task 1 acquired lock B")

    // Perform task with both locks acquired

    lockB.unlock()
    print("Task 1 released lock B")

    lockA.unlock()
    print("Task 1 released lock A")
}

func performTask2() {
    lockB.lock()
    print("Task 2 acquired lock B")

    // Simulate some processing time
    Thread.sleep(forTimeInterval: 1)

    lockA.lock()
    print("Task 2 acquired lock A")

    // Perform task with both locks acquired

    lockA.unlock()
    print("Task 2 released lock A")

    lockB.unlock()
    print("Task 2 released lock B")
}

let queue = DispatchQueue(label: "com.example.deadlock", attributes: .concurrent)
queue.async {
    performTask1()
}

queue.async {
    performTask2()
}

// This barrier never runs: both tasks are deadlocked, so the program hangs here
queue.sync(flags: .barrier) {}

print("Tasks complete!")

In the above example, two tasks (performTask1 and performTask2) are executed concurrently on separate threads. Each task tries to acquire two locks (lockA and lockB) in a different order. This creates a potential deadlock scenario.

If performTask1 acquires lockA and then gets preempted by the system, and performTask2 acquires lockB in the meantime, both tasks will be waiting for each other to release the locks they hold. This results in a deadlock, where neither task can proceed, leading to a program that hangs indefinitely.

To avoid deadlocks, it’s crucial to ensure that locks are always acquired in a consistent order across different parts of your code. If the locks need to be acquired in a different order, it’s necessary to restructure the code or employ deadlock avoidance techniques, such as using timeouts or detecting circular dependencies.
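As a sketch of the timeout technique: NSLock's lock(before:) returns false if the lock cannot be acquired by the given deadline, letting a task back off instead of blocking forever. (The 0.2-second timeout and the acquireBoth helper are illustrative.)

```swift
import Foundation

let lockA = NSLock()
let lockB = NSLock()

// Try to take both locks; if the second cannot be acquired before the
// deadline, release the first and report failure instead of deadlocking.
func acquireBoth(timeout: TimeInterval) -> Bool {
    lockA.lock()
    if lockB.lock(before: Date().addingTimeInterval(timeout)) {
        return true // Caller now holds both locks
    }
    lockA.unlock() // Back off so other threads can make progress
    return false
}

lockB.lock() // Simulate another thread already holding lock B
let acquired = acquireBoth(timeout: 0.2)
print("Acquired both locks: \(acquired)") // false: timed out, no deadlock
lockB.unlock()
```

A real back-off strategy would typically retry after a short delay rather than give up, but the key point is that neither thread ends up blocked forever.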

// Deadlock (Fixed)

import Foundation

let lockA = NSLock()
let lockB = NSLock()

func performTask1() {
    lockA.lock()
    print("Task 1 acquired lock A")

    // Simulate some processing time
    Thread.sleep(forTimeInterval: 1)

    lockB.lock()
    print("Task 1 acquired lock B")

    // Perform task with both locks acquired

    lockB.unlock()
    print("Task 1 released lock B")

    lockA.unlock()
    print("Task 1 released lock A")
}

func performTask2() {
    lockB.lock()
    print("Task 2 acquired lock B")

    // Simulate some processing time
    Thread.sleep(forTimeInterval: 1)

    lockA.lock()
    print("Task 2 acquired lock A")

    // Perform task with both locks acquired

    lockA.unlock()
    print("Task 2 released lock A")

    lockB.unlock()
    print("Task 2 released lock B")
}

let queue = DispatchQueue(label: "com.example.deadlock", attributes: .concurrent)
queue.async {
    performTask1()
}

// Wait for the first task to complete, so the two tasks never hold locks at
// the same time. (Acquiring the locks in the same order in both tasks would
// also prevent the deadlock.)
queue.sync(flags: .barrier) {}

queue.async {
    performTask2()
}

// Wait for the second task to complete
queue.sync(flags: .barrier) {}

print("Tasks complete!")

Thread-Safe Operations

Thread-safe operations are essential in scenarios where multiple threads or concurrent access can occur to shared resources. Let’s consider a real-life case to understand when thread-safe operations are needed in Swift:

Example: Shopping Cart in an E-commerce App

In an e-commerce app, multiple users may add items to their shopping carts simultaneously. To handle this, the shopping cart should be implemented with thread safety to prevent race conditions and ensure data integrity. Here’s how you can achieve that:

// A minimal item type; Equatable is required by firstIndex(of:)
struct CartItem: Equatable {
    let id: Int
    let name: String
}

class ShoppingCart {
    private var items: [CartItem] = []
    private let accessQueue = DispatchQueue(label: "com.example.shoppingCart", attributes: .concurrent)

    // Add an item to the shopping cart
    func addItem(_ item: CartItem) {
        accessQueue.async(flags: .barrier) {
            self.items.append(item)
        }
    }

    // Remove an item from the shopping cart
    func removeItem(_ item: CartItem) {
        accessQueue.async(flags: .barrier) {
            if let index = self.items.firstIndex(of: item) {
                self.items.remove(at: index)
            }
        }
    }

    // Retrieve the current items in the shopping cart
    func getItems() -> [CartItem] {
        var retrievedItems: [CartItem] = []
        accessQueue.sync {
            retrievedItems = self.items
        }
        return retrievedItems
    }
}

In the code above, the shopping cart is implemented as a class with a private items array to store the cart items. The accessQueue is a concurrent queue used for thread safety.

  • When adding an item (addItem(_:)) or removing an item (removeItem(_:)) from the shopping cart, the accessQueue is used with the .barrier flag to ensure that the add or remove operations are performed exclusively and sequentially. This prevents multiple threads from modifying the items array simultaneously, preserving data integrity.
  • When retrieving the items (getItems()), the accessQueue is used with the .sync dispatch to ensure that the retrieval operation is performed synchronously. This guarantees that the items array is read safely without potential conflicts caused by concurrent modifications.

By making the shopping cart implementation thread-safe, you ensure that multiple users can concurrently modify their shopping carts without data corruption or inconsistent results. The concurrent queue with barrier writes lets reads proceed in parallel while each write gets exclusive access to the shared items array, preventing race conditions and maintaining data consistency.

While DispatchQueue is the preferred choice for most concurrency scenarios in Swift, there are cases where NSLock can still be useful. Here's a real-life example where NSLock may be applicable:

Example: Bank Account Transactions

Consider a banking application that allows users to perform transactions on their bank accounts concurrently. Each transaction involves reading the current account balance, performing some calculations, and updating the balance. In this scenario, thread safety is crucial to avoid race conditions and ensure accurate balance updates.

Using NSLock, you can synchronize access to the account balance and ensure that only one thread modifies it at a time:

class BankAccount {
    private var balance: Double = 0.0
    private let lock = NSLock()

    func deposit(amount: Double) {
        lock.lock()
        balance += amount
        lock.unlock()
    }

    func withdraw(amount: Double) {
        lock.lock()
        balance -= amount
        lock.unlock()
    }

    func getBalance() -> Double {
        lock.lock()
        let currentBalance = balance
        lock.unlock()
        return currentBalance
    }
}

In the above code, the BankAccount class uses NSLock to synchronize access to the balance property. The lock is acquired before modifying or reading the balance and released afterward to allow other threads to access it.

In this banking application, multiple concurrent threads can call the deposit(), withdraw(), and getBalance() methods. By using NSLock, you ensure that only one thread can modify or access the balance at a time, preventing race conditions and guaranteeing accurate account transactions.

Although DispatchQueue can achieve the same result, NSLock may be a suitable choice in scenarios where you require more granular control over locking and unlocking or when integrating with existing Objective-C code that already uses NSLock.

When considering the use of NSLock, be mindful of the potential drawbacks, such as the need for manual locking and unlocking operations and the possibility of deadlocks if not used correctly. In general, it is recommended to evaluate DispatchQueue first and resort to NSLock when specific requirements or constraints warrant its usage.
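One way to reduce the risk of a forgotten unlock is to wrap every critical section in a small helper that pairs lock() with a deferred unlock(). This is a sketch (recent Foundation versions ship a similar NSLock.withLock, but the idea is the same):

```swift
import Foundation

// `defer` guarantees the unlock runs even if the body throws or returns early
func withLock<T>(_ lock: NSLock, _ body: () throws -> T) rethrows -> T {
    lock.lock()
    defer { lock.unlock() }
    return try body()
}

let balanceLock = NSLock()
var balance = 0.0

withLock(balanceLock) { balance += 100 }
withLock(balanceLock) { balance -= 30 }
print("Balance: \(balance)") // 70.0
```

Because the unlock is attached to scope exit rather than written at every return point, the helper removes the most common NSLock mistake by construction.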

And that's it! Thank you for reading! Feel free to ask questions or leave feedback in the comments section.
