Multithreading in iOS applications

Mohammed ElDoheiri
5 min read · Feb 2, 2019


Covering the basics of multithreading in an iOS application.

Multithreading is one of the topics in iOS application development that causes the most headaches, from the concept of threads, to dispatch queues, to deadlocks, to managing access to a resource shared between threads. It is very easy to get things wrong and end up with bugs that are hard to debug, or even to detect, because they appear at nondeterministic times.

Of course we cannot cover everything related to multithreading in one post (nor do I have the knowledge and experience to do so :-)), but we can at least cover the basics of the concepts mentioned above.

Let’s start with what a thread is.

A thread in an application process has one call stack and one program counter, and it shares the address space of its enclosing process. (Each application has its own address space where the binary instructions, heap, and stacks live; collectively, these make up the application process.) Because a thread has only one call stack and one instruction counter, it can only execute instructions serially, one after the other.

If a single thread does all the work in your application, you never have to worry about access to a shared resource: access to it is guaranteed to happen sequentially, one instruction after the other, and you will never hit weird multithreading bugs like race conditions and deadlocks. On the other hand, if your application does heavy processing, the whole application, including the UI, will be very slow, and you won’t be taking advantage of multicore processors.

So, to keep the UI smooth, you need to offload heavy processing onto another thread. That way the UI can maintain a 60fps (or close) frame rate, the application feels responsive, and your animations stay crisp.
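As a minimal sketch of this pattern, here is heavy work offloaded to a background queue, with the result handed back to the main queue for UI updates. The `heavyWork` function is a hypothetical stand-in for image processing, parsing, and the like; the `DispatchGroup` and `wait()` at the end exist only so this command-line sketch can observe the result — in a real app you would never block like that.

```swift
import Foundation

// Hypothetical stand-in for CPU-heavy work (image processing, parsing, ...).
func heavyWork() -> Int {
    return (1...1_000).reduce(0, +)
}

let group = DispatchGroup() // only so this sketch can wait; not needed in an app
var result = 0

// Offload the heavy work to a background queue so the main thread stays free.
DispatchQueue.global(qos: .userInitiated).async(group: group) {
    result = heavyWork()
    // In a real app you would hop back to the main queue for UI updates:
    // DispatchQueue.main.async { imageView.image = processed }
}

group.wait() // demo only; blocking the caller defeats the purpose in an app
print(result)
```

The key shape is the nesting: background queue for the work, `DispatchQueue.main.async` for anything that touches the UI.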

So… what’s a dispatch queue, then?

Well, creating and maintaining threads manually, and doing all the synchronization between them, is a complicated and error-prone task, so Apple created an abstraction layer that does the heavy work for you. All you need to do is wrap the code that should run on a background thread in a block, dispatch it to one of the ready-made queues along with any configuration you need, and it will just do it for you.

Sounds very easy, right? Not quite: you still need to understand what you’re doing, because you can still run into race conditions and deadlocks, to say the least.

GCD dispatch queues

GCD dispatch queues come in a couple of flavours: vanilla and mango… just kidding! :-D

Serial and concurrent.

Serial queues execute the units of work dispatched to them one at a time: the current task has to return before the next one starts. You shouldn’t assume, however, that all the tasks will run on the same thread; for optimization reasons the queue may use one thread for one task and a different thread for another — be aware of this! The exception is the main queue, a serial queue that only ever uses the main application thread. Unfortunately, you cannot control which threads a serial queue uses; the concept of threads is abstracted away from the developer. You can of course create threads yourself, but you shouldn’t unless you really know what you’re doing!
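A short sketch of the serial guarantee (the queue label is just an illustrative convention):

```swift
import Foundation

// A private serial queue: tasks run one at a time, in FIFO order,
// though not necessarily all on the same underlying thread.
let serialQueue = DispatchQueue(label: "com.example.serial")

var log: [Int] = []
for i in 1...3 {
    serialQueue.async {
        log.append(i) // safe: only one task touches `log` at a time
    }
}

serialQueue.sync { } // drain the queue so we can inspect the result (demo only)
print(log)           // [1, 2, 3] — FIFO order is guaranteed
```

Even though the three tasks may land on different threads, the queue guarantees they never overlap and always run in dispatch order.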

Concurrent queues, on the other hand, execute their tasks concurrently, so unlike serial queues there is no guarantee of the order in which tasks finish, or of which threads they run on. This kind of queue is a good fit for tasks that have no dependency on each other (although if they access the same resource, access to it still needs to be made thread safe), and there are ways to wait for all of them to finish using dispatch groups.
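Here is a sketch of independent tasks on a concurrent queue, gathered with a `DispatchGroup`. Since the tasks share the `results` dictionary, it is guarded with a lock; `group.wait()` is used so the sketch can print the outcome (in an app you would prefer `group.notify` to avoid blocking):

```swift
import Foundation

let concurrent = DispatchQueue(label: "com.example.concurrent",
                               attributes: .concurrent)
let group = DispatchGroup()
var results = [Int: Int]()
let lock = NSLock() // `results` is shared between tasks, so guard it

for i in 1...4 {
    concurrent.async(group: group) {
        let value = i * i // independent work; finish order is not guaranteed
        lock.lock()
        results[i] = value
        lock.unlock()
    }
}

group.wait() // demo only; in an app use group.notify(queue:) instead
print(results.sorted { $0.key < $1.key })
```

All four tasks may run simultaneously on different threads, which is exactly why the shared dictionary needs the lock.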

Concurrent queues also have a feature called barrier blocks. A barrier block is a task dispatched to a concurrent queue with the requirement that all previously dispatched tasks finish before it runs, and that all subsequently dispatched tasks wait until it has finished. So if you dispatch every task to a concurrent queue as a barrier task, you have effectively converted the concurrent queue into a serial queue!
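A minimal sketch of the barrier pattern — concurrent reads, an exclusive write:

```swift
import Foundation

let queue = DispatchQueue(label: "com.example.rw", attributes: .concurrent)
var value = 0

// Readers may run concurrently with each other:
queue.async { _ = value }
queue.async { _ = value }

// The barrier write waits for in-flight readers, runs alone,
// and makes later tasks wait until it finishes:
queue.async(flags: .barrier) {
    value = 42
}

// A read dispatched after the barrier is guaranteed to see the new value:
let observed = queue.sync { value }
print(observed) // 42
```

This reader/writer shape is the classic use of barriers: many simultaneous reads, with writes serialized against everything else.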

Dispatching to queues — whether serial or concurrent — comes in two flavours too: synchronously and asynchronously.

If you dispatch a task synchronously — from the main thread, for example — the calling thread blocks and waits until the dispatched task returns, which can be dangerous, as the next section explains. If you dispatch it asynchronously, the calling thread continues on its way without waiting, which is ideal for fire-and-forget work such as sending an event to a backend API.
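The two flavours side by side, in a small sketch. One consequence of synchronous dispatch is that the block can hand a result straight back to the caller:

```swift
import Foundation

let worker = DispatchQueue(label: "com.example.worker")

// Asynchronous dispatch: fire and forget, the caller moves on immediately.
worker.async {
    // e.g. send an analytics event to a backend API
}

// Synchronous dispatch: the caller blocks until the block returns,
// so it can even hand back a value.
let answer = worker.sync { () -> Int in
    return 6 * 7
}
print(answer) // 42
```

Note that `sync` here is called from the main thread onto a *different* queue; calling `sync` onto the queue you are already running on is exactly the deadlock described next.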

Deadlocks

The last concept in this post: what is a deadlock?

A deadlock happens when thread A is waiting for thread B while thread B is waiting for thread A — and A and B can even be the same thread, waiting for itself!

For example, suppose a function called readFromFile() dispatches a task synchronously to a serial queue, and inside that task readFromFile() is called recursively, dispatching a second task synchronously to the same queue. The first task cannot finish until the second task has finished, but the second task cannot start until the first has finished — because it’s a serial queue — and you have a deadlock!
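A hypothetical sketch of that bug; the recursive call is left commented out, since uncommenting it would hang the program forever:

```swift
import Foundation

let fileQueue = DispatchQueue(label: "com.example.file") // serial
var calls = 0

func readFromFile() {
    fileQueue.sync {
        calls += 1
        // ... read some bytes ...
        // Calling readFromFile() here would dispatch a second sync task
        // onto the same serial queue: the outer task cannot return until
        // the inner one runs, and the inner one cannot run until the
        // outer one returns — a deadlock.
        // readFromFile() // <- uncommenting this hangs forever
    }
}

readFromFile() // safe as written; the recursive call is commented out
print("no deadlock")
```

The same trap, in its simplest form, is calling `DispatchQueue.sync` onto the queue your code is already executing on.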

Conclusion

I’ve tried my best to explain the basics of what a thread is, what dispatch queues and their types are, and what a deadlock is, but the topic is far from exhausted — multithreading is a big subject that can fill books. In the next post I will cover, in specifics, how to use dispatch queues to make access to a shared resource thread safe.
