Concurrency Visualized — Part 2: Serial vs Concurrent

Besher Al Maleh
Jan 29, 2020 · 5 min read


This is part 2 of my concurrency series. If you missed part 1, check it out here.

In part 1, we explored the differences between synchronous and asynchronous execution when dispatching tasks using GCD. Now, we’re going to focus on what happens in the queue after you dispatch your task.

Serial vs Concurrent

Serial and concurrent affect the destination: the queue your work is submitted to. This is in contrast to sync and async, which affect the source, i.e. the caller doing the submitting.

A serial queue will not execute its work on more than one thread at a time, regardless of how many tasks you dispatch on that queue. Consequently, the tasks are guaranteed not only to start, but also to finish, in first-in, first-out (FIFO) order. Moreover, when you block a serial queue (using a sync call, a semaphore, or some other tool), all work on that queue halts until the blocking call is over.

[Animation of a serial queue, from Dispatcher on GitHub]
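Here's a quick sketch of that behavior in code (the queue label is made up):

```swift
import Foundation

// A serial queue runs one task at a time, in FIFO order.
let serialQueue = DispatchQueue(label: "com.example.serial")

for i in 1...3 {
    serialQueue.async {
        print("Task \(i) started")
        print("Task \(i) finished")
    }
}
// Task 1 always starts and finishes before Task 2, and so on.
```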

A concurrent queue can spawn multiple threads, and the system decides how many threads are created. Tasks always start in FIFO order, but the queue does not wait for one task to finish before starting the next, so tasks on a concurrent queue can finish in any order. When you perform a blocking command on a concurrent queue, it does not block the other threads on that queue. However, when a concurrent queue gets blocked, it runs the risk of thread explosion. I will cover this in more detail later on.

[Animation of a concurrent queue, from Dispatcher on GitHub]
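And a quick sketch of the concurrent case (again, the label is a placeholder):

```swift
import Foundation

// The .concurrent attribute lets the queue use multiple threads.
let concurrentQueue = DispatchQueue(label: "com.example.concurrent",
                                    attributes: .concurrent)

for i in 1...5 {
    concurrentQueue.async {
        print("Task \(i) finished") // tasks start in order, but finish order is not guaranteed
    }
}
```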

The main queue in your app is serial. All the global pre-defined queues are concurrent. Any private dispatch queue you create is serial by default but can be set to be concurrent using an optional attribute as discussed in part 1.
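In code, that recap looks roughly like this (labels are placeholders):

```swift
import Foundation

let mainQueue = DispatchQueue.main                              // serial
let globalQueue = DispatchQueue.global(qos: .utility)           // concurrent
let privateSerial = DispatchQueue(label: "com.example.serial")  // serial by default
let privateConcurrent = DispatchQueue(label: "com.example.concurrent",
                                      attributes: .concurrent)  // opted in to concurrency
```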

It’s important to note here that the concept of serial vs concurrent is only relevant when discussing a specific queue. All queues are concurrent relative to each other. That is, if you dispatch work asynchronously from the main queue to a private serial queue, that work will be completed concurrently with respect to the main queue. And if you create two different serial queues, and then perform blocking work on one of them, the other queue is unaffected.

To demonstrate the concurrency of multiple serial queues, let’s take this example:
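Something along these lines, with two made-up serial queues at different QoS levels:

```swift
import Foundation

// Two serial queues; labels and QoS values are illustrative.
let queue1 = DispatchQueue(label: "com.example.queue1", qos: .userInteractive)
let queue2 = DispatchQueue(label: "com.example.queue2", qos: .utility)

queue1.async {
    for letter in "abcde" {
        print(letter, terminator: " ")
    }
}

queue2.async {
    for number in 1...5 {
        print(number, terminator: " ")
    }
}
// Possible output: a 1 b 2 c d 3 e 4 5 (interleaved)
```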

Both queues here are serial, but the results are jumbled because they execute concurrently in relation to each other. The fact that they are each serial (or concurrent) has no effect on this result. Their QoS level determines which one will generally finish first (the order is not guaranteed).

If we want to ensure the first loop finishes first before starting the second loop, we can submit the first task synchronously from the caller:
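Roughly like this, using the same kind of illustrative queues:

```swift
import Foundation

let queue1 = DispatchQueue(label: "com.example.queue1")
let queue2 = DispatchQueue(label: "com.example.queue2")

// sync blocks the caller until the first loop finishes.
queue1.sync {
    for letter in "abcde" { print(letter, terminator: " ") }
}

// Only now does the caller reach this line.
queue2.async {
    for number in 1...5 { print(number, terminator: " ") }
}
// Output: a b c d e 1 2 3 4 5
```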

This is not necessarily desirable, because we are now blocking the caller while the first loop is executing.

To avoid blocking the caller, we can submit both tasks asynchronously, but to the same serial queue:
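For example (the label is a placeholder):

```swift
import Foundation

let serialQueue = DispatchQueue(label: "com.example.serial")

// The caller is never blocked, but the serial queue still runs
// the two tasks one after the other, in submission order.
serialQueue.async {
    for letter in "abcde" { print(letter, terminator: " ") }
}

serialQueue.async {
    for number in 1...5 { print(number, terminator: " ") }
}
// Output: a b c d e 1 2 3 4 5
```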

Now our tasks execute concurrently with respect to the caller, while also keeping their order intact.

Note that if we make our single queue concurrent via the optional parameter, we go back to the jumbled results, as expected:
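A sketch of that variation:

```swift
import Foundation

let concurrentQueue = DispatchQueue(label: "com.example.concurrent",
                                    attributes: .concurrent)

concurrentQueue.async {
    for letter in "abcde" { print(letter, terminator: " ") }
}

concurrentQueue.async {
    for number in 1...5 { print(number, terminator: " ") }
}
// Possible output: a 1 b 2 c 3 d 4 e 5 (jumbled again)
```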

Sometimes you might confuse synchronous execution with serial execution (at least I did), but they are very different things. For example, try changing the first dispatch in our previous example to a sync call:
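Sketched out, that change looks like this:

```swift
import Foundation

let concurrentQueue = DispatchQueue(label: "com.example.concurrent",
                                    attributes: .concurrent)

// The first dispatch is now sync: the caller waits for it to return
// before it even reaches the second dispatch.
concurrentQueue.sync {
    for letter in "abcde" { print(letter, terminator: " ") }
}

concurrentQueue.async {
    for number in 1...5 { print(number, terminator: " ") }
}
// Output: a b c d e 1 2 3 4 5, even though the queue is concurrent.
```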

This can be misleading

Suddenly, our results are back in perfect order, but this is a concurrent queue, so how could that happen? Did the sync statement somehow turn it into a serial queue?

The answer is no!

This is a bit sneaky. What happened is that we did not reach the async call until the first task had completed its execution. The queue is still very much concurrent, but inside this zoomed-in section of the code, it appears as if it were serial. This is because we are blocking the caller, and not proceeding to the next task until the first one is finished.

If another queue somewhere else in your app tried submitting work to this same queue while it was still executing the sync statement, that work would run concurrently with whatever we have running here, because it’s still a concurrent queue.

Which one to use?

Serial queues take advantage of CPU optimizations and caching, and they help reduce context switching. Apple recommends starting with one serial queue per subsystem in your app (for example, one for networking, one for file compression, and so on). If the need arises, you can later expand to a hierarchy of queues per subsystem using the setTarget(queue:) method or the optional target parameter when creating queues.
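A rough sketch of what that might look like (all names and labels here are made up):

```swift
import Foundation

// One serial queue per subsystem.
let networkingQueue = DispatchQueue(label: "com.example.networking")
let compressionQueue = DispatchQueue(label: "com.example.compression")

// Expanding later into a hierarchy: queues can funnel into a shared target queue,
// either via the optional `target` parameter at creation time...
let networkingTarget = DispatchQueue(label: "com.example.networking.target")
let uploadQueue = DispatchQueue(label: "com.example.upload", target: networkingTarget)

// ...or by calling setTarget(queue:) before the queue is first used.
let downloadQueue = DispatchQueue(label: "com.example.download")
downloadQueue.setTarget(queue: networkingTarget)
```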

If you run into a performance bottleneck, measure your app’s performance, then see if a concurrent queue helps. If you do not see a measurable benefit, it’s better to stick to serial queues.

— End of Part 2 —

Check out Part 3 here:

If you have any questions or comments, feel free to reach out to me on Twitter

Thanks for reading. If you enjoyed this article, feel free to hit that clap button 👏 to help others find it. If you *really* enjoyed it, you can clap up to 50 times 😃

Check out some of my other articles:
