Concurrency in iOS: GCD

Tifo Audi Alif Putra

This is my first article about concurrency in iOS.


Introduction to Concurrency

Concurrency is one of the most important topics that iOS developers need to know. We can use concurrency to create smooth, reliable, and responsive applications. Users will rarely praise you when your app performs well, but when your app is laggy they will send complaints, and it can even hurt your rating on the App Store. So these days, understanding concurrency is a must-have skill even at the junior level.

According to Wikipedia, concurrency is the ability of different parts or units of a program, algorithm, or problem to be executed out of order or in partial order without affecting the final outcome. A simpler definition would be a computer's ability to deal with executing one or more tasks at the same time.

A single-core device, however, uses a technique called time-slicing to achieve concurrency. It runs one thread, performs a context switch to another thread, and at some point switches back to the previous thread. This creates the illusion that the two threads are executing at the same time. Multi-core devices can achieve something called parallelism, executing more than one task at literally the same time.

The key difference is that concurrency is about dealing with multiple tasks at the same time, while parallelism is about actually doing multiple tasks at the same time.

Concurrency in iOS

To achieve concurrency in iOS, there are two libraries that Apple recommends we use: GCD and OperationQueue. What are they? How do we pick the right one? We will discuss GCD first in this article.

Grand Central Dispatch

GCD, or Grand Central Dispatch, is a library made by Apple to manage concurrent operations. It is built around the concept of a queue: the first operation added to the queue is the first one to start executing.

There are two types of queues: serial queues and concurrent queues. A serial queue makes sure that all tasks inside the queue are executed one by one; it waits for the current task to complete and only then starts the next task, in consecutive order.

A concurrent queue, on the other hand, also starts tasks one by one, but it does not wait for the current task to complete; it immediately continues with the next task.

Using a concurrent queue can make the work inside our app much faster than a serial queue. But keep in mind that with a concurrent queue we cannot control the order in which tasks complete. That depends entirely on the operating system and many factors such as latency, the scheduling algorithm, and so on. A serial queue, in contrast, is much easier to predict, because it is guaranteed to wait for the current task to complete before executing the next one.
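
To make the difference concrete, here is a minimal sketch you can run in a playground. The queue labels and sleep durations are just assumptions for illustration, not from a real project:

import Foundation

let serialExample = DispatchQueue(label: "com.example.serial")
let concurrentExample = DispatchQueue(label: "com.example.concurrent", attributes: .concurrent)

// On the serial queue, tasks always finish in the order they were added.
serialExample.async {
    Thread.sleep(forTimeInterval: 0.2) // simulate a slow task
    print("serial task 1 done")
}
serialExample.async {
    print("serial task 2 done") // always printed after task 1
}

// On the concurrent queue, task 2 usually finishes before task 1,
// because the queue does not wait for task 1 to complete.
concurrentExample.async {
    Thread.sleep(forTimeInterval: 0.2) // simulate a slow task
    print("concurrent task 1 done")
}
concurrentExample.async {
    print("concurrent task 2 done")
}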

In GCD we can create a dispatch queue object ourselves, or we can use either the main queue or a global queue already provided by the system. All tasks dispatched to the main queue are executed on the main thread; it is a serial queue, and you must make sure all of your UI work runs on the main thread, otherwise your app may crash. Global queues are concurrent queues managed by the operating system. When using a global queue, the system decides the priority of the task using something called quality of service (QoS).

There are four quality of service classes that we can use when dispatching a task to a global queue (a short code sketch follows the list):

  1. User-Interactive
    Tasks that use the .userInteractive quality of service are small tasks that need to finish immediately, such as animations or UI updates.
  2. User-Initiated
    Tasks that use the .userInitiated quality of service are asynchronous tasks initiated by the user from the UI. Use this quality of service class when the user is waiting for an immediate result from the async task in order to continue interacting with the UI. The system gives it high priority on the global queue. For example, use this quality of service when you need to apply a complex filter to an image.
  3. Utility
    Tasks that use the .utility quality of service usually come with an activity indicator on the UI side, such as networking or long-running computations. The system executes these tasks with low priority on the global queue.
  4. Background
    This quality of service is used to execute tasks whose progress or result the user does not need to know about, such as maintenance or resource cleanup. The system executes these tasks with background priority on the global queue.
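
As a rough sketch of how these quality of service classes map to code (the work inside each closure is only a placeholder, not from the article):

// Highest priority: tiny work tied directly to user interaction.
DispatchQueue.global(qos: .userInteractive).async {
    // e.g. prepare data needed immediately for an animation or UI update
}

// High priority: the user started this and is waiting for the result.
DispatchQueue.global(qos: .userInitiated).async {
    // e.g. apply a complex filter to an image
}

// Low priority: longer work, usually shown with an activity indicator.
DispatchQueue.global(qos: .utility).async {
    // e.g. download a file or run a long computation
}

// Lowest priority: work the user never observes.
DispatchQueue.global(qos: .background).async {
    // e.g. clean up caches or perform maintenance
}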

We can dispatch a function or an action on whichever queue we choose, either synchronously or asynchronously. Synchronous means that control returns to the caller only after the current task has finished.

for i in 0...5 {
    print(i)
}
print("looping is finished")

// Output:
// 0
// 1
// 2
// 3
// 4
// 5
// looping is finished

The code above is an example of a synchronous process: the print statement "looping is finished" is executed only after the loop has finished.

An asynchronous process returns control to the caller immediately, without waiting for the current task to finish.

URLSession.shared.dataTask(with: url) { data, response, error in
    print(response)
}.resume()
print("is fetching")

In the code example above, the print statement "is fetching" is executed right away, without waiting for the session to finish fetching the data from the server.

Serial and concurrent queues are a different concept from synchronous and asynchronous. Keep in mind that serial versus concurrent describes the queue itself and how the system executes its tasks: one at a time, or several at once on multiple threads. Sync versus async is about the caller's side: whether we wait for the dispatched task to complete or return immediately.
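
Here is a minimal sketch of that distinction; the serial queue label is an arbitrary assumption:

let workerQueue = DispatchQueue(label: "com.example.worker") // a serial queue

// sync: the caller waits for the closure to finish before moving on.
workerQueue.sync {
    print("inside sync block")
}
print("after sync") // always printed after "inside sync block"

// async: the caller continues immediately; the closure runs later on the queue.
workerQueue.async {
    print("inside async block")
}
print("after async") // usually printed before "inside async block"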

DispatchQueue

When using GCD inside your app, you will use the DispatchQueue object to manage operations. As explained above, we can use the main queue, a global queue, or a custom queue that we create ourselves.

// Perform reload data on the main queue; it will be executed on the main thread.
DispatchQueue.main.async {
    self.tableView.reloadData()
}

When you want to use the main queue, you can access the main static property on DispatchQueue directly and dispatch an action either synchronously or asynchronously. In the sample code above, we reload the table view on the main queue and dispatch it asynchronously, which means we do not need to wait for the reload to complete.

Always remember: when you are already running on the main queue, do not dispatch an action to it synchronously. If you do, you block the main thread, which creates a deadlock, and your app will freeze or crash.
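
For example, a call like the following, made from code that is already running on the main thread (say, inside viewDidLoad), never returns; it is shown here only to illustrate the mistake:

// DON'T do this from the main thread:
// the sync call waits for the closure to finish, but the closure can only run
// once the main thread is free, so the main thread ends up waiting on itself.
DispatchQueue.main.sync {
    self.tableView.reloadData()
}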

func applyFilter(to inputImage: UIImage) {
    DispatchQueue.global(qos: .userInitiated).async {
        let finalImage = SomeFilter.apply(inputImage)

        DispatchQueue.main.async {
            self.imageView.image = finalImage
        }
    }
}

Similar to the main queue, you can call the global static method on DispatchQueue and dispatch an action either synchronously or asynchronously. In the code example above, we use the .userInitiated quality of service to perform the filter operation off the main thread; with this approach the app stays smooth and responsive. When the final image is ready, do not forget to dispatch the UI update back to the main thread by using the main queue.

// Serial queue
let serialQueue = DispatchQueue(label: "com.example-serialQueue")
serialQueue.async {
    print("1")
}
serialQueue.async {
    print("2")
}

// Concurrent queue
let concurrentQueue = DispatchQueue(label: "com.example-concurrentQueue", attributes: .concurrent)
concurrentQueue.async {
    print("3")
}
concurrentQueue.async {
    print("4")
}

And finally, we can create our own custom queue. By default, the queue you create is a serial queue unless you pass the .concurrent attribute to the initializer.

DispatchWorkItem

As you can see in the previous code examples, DispatchQueue always takes a closure containing the action we want to perform. There is another way to add an action to a DispatchQueue: using DispatchWorkItem.

let workItem = DispatchWorkItem {
    print("1")
}
DispatchQueue.main.async(execute: workItem)

It is pretty straightforward: you pass a closure to DispatchWorkItem, then pass the work item as an argument to the DispatchQueue function. The benefit of using DispatchWorkItem is that you can cancel the action.

var searchWorkItem: DispatchWorkItem?

func search(_ query: String) {
    searchWorkItem?.cancel()
    let workItem = DispatchWorkItem {
        // perform search
    }

    searchWorkItem = workItem
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.35, execute: workItem)
}
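
As a usage sketch (the queries are made up), rapid successive calls cancel the pending work item, so only the last query is likely to run after the 0.35-second delay. Note that cancel() only prevents a work item that has not started yet:

search("s")
search("sw")
search("swi")
search("swift") // only this search is likely to be performed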

Concurrency Problem

While concurrency gives us a lot of benefits, it has one main enemy: shared mutable state. Remember, concurrency is about dealing with multiple tasks at the same time, and because of that we cannot predict the exact output of concurrent operations. Take a look at this sample code.

var someValue: String = ""
let concurrentQueue = DispatchQueue(label: "com.domain.concurrentQueue", attributes: .concurrent)

concurrentQueue.async {
    someValue = "A"
}
concurrentQueue.async {
    someValue = "B"
}
concurrentQueue.async {
    someValue = "C"
}
print(someValue)

We have a shared mutable state called someValue that holds a string. Then we create a concurrent queue and perform multiple writes asynchronously. When you check the output, it will most likely print "C". But if you run it multiple times, you may find that it prints "B" (or something else) instead of "C". The data is inconsistent because of a race condition. In a real-world case, crashes caused by race conditions are really hard to reproduce; this is one of the most common problems caused by concurrency.

The definition of thread-safe, at least in my opinion, is that shared mutable state is safe to access from multiple threads. There are a couple of ways to make shared mutable state thread-safe. First, instead of dispatching actions asynchronously, we can use synchronous operations to prevent the race condition. Remember, a synchronous operation waits for the current operation to finish before we can move on to the next one.

var someValue: String = ""
let concurrentQueue = DispatchQueue(label: "com.domain.concurrentQueue", attributes: .concurrent)

concurrentQueue.sync {
    someValue = "A"
}
concurrentQueue.sync {
    someValue = "B"
}
concurrentQueue.sync {
    someValue = "C"
}
print(someValue) // C

Another way to prevent the race condition is to use a dispatch barrier, the .barrier flag. A barrier makes the dispatched action act as a barrier block when it is executed on a concurrent queue.

For example, if we dispatch task 2 to a concurrent queue with the barrier flag, it blocks the other tasks until task 2 finishes, and then execution continues with the next task. You can imagine that using a dispatch barrier makes a concurrent queue behave a bit like a serial queue for that particular task.

var someValue: String = ""
let concurrentQueue = DispatchQueue(label: "com.domain.concurrentQueue", attributes: .concurrent)

concurrentQueue.async(flags: .barrier) {
    someValue = "A"
    print("first write operation: \(someValue)")
}
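
To show how the barrier flag is typically used, here is a minimal sketch of the common reader-writer pattern; the ThreadSafeValue type and its queue label are hypothetical, not from the article:

final class ThreadSafeValue {
    private var value: String = ""
    private let queue = DispatchQueue(label: "com.domain.threadSafeValue", attributes: .concurrent)

    // Reads are dispatched synchronously and can run concurrently with other reads.
    func read() -> String {
        queue.sync { value }
    }

    // Writes use the barrier flag: the queue waits for in-flight reads to finish,
    // runs the write exclusively, then resumes concurrent execution.
    func write(_ newValue: String) {
        queue.async(flags: .barrier) {
            self.value = newValue
        }
    }
}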

Where to go from here

Congratulations on following this article. I hope you now have a basic foundation in concurrency, and specifically GCD, in iOS. You can explore more of the topics not covered here, such as DispatchGroup, for when you need to group several tasks, or DispatchSemaphore, for when you need to limit the number of threads executing at once. Concurrency is a huge topic, and you will need time to absorb all of it. Thank you and see you in another article, which will probably talk about OperationQueue :].
