Grand Central Dispatch — How Terminal! (Swift 3)

Erica Millado · Published in Yay It’s Erica · 6 min read · Dec 22, 2016
The other “Grand Central”…Station, that is.

Ah, multi-threading, concurrency, and the like. These are the topics that, at Flatiron School, we were told were “really important,” and now that I am starting to job hunt, read job descriptions, and go on technical interviews, I am realizing that Flatiron wasn’t lying.

So here we go, let’s focus on one thing: Grand Central Dispatch (GCD).

Grand Central Dispatch (the framework formerly known as libdispatch) is Apple’s framework for incorporating concurrency (the management of multiple things happening at the same time) into your applications. Basically, GCD manages queues for us (it does all the scheduling of tasks!) so we don’t have to think about how to create threads or where to “lock” lines of our code. GCD allows us (developers) to write code as if it were a single thread or process; as a result, we only need to decide where to break off parts of our code to run on another thread.

Apple’s documentation states, “GCD provides and manages FIFO (first in, first out) queues to which your application can submit tasks in the form of block objects. Work submitted to dispatch queues is executed on a pool of threads fully managed by the system. No guarantee is made as to the thread on which a task executes.”

In a nutshell, GCD is our little helper because it:

  • helps our app’s performance by running long-running tasks in the background.
  • makes things easier for developers by avoiding the need to set locks and detach code onto new threads ourselves (which can lead to concurrency bugs if not done correctly).
  • gives us cleaner, better-performing patterns for things such as thread-safe singletons (see the sketch after this list).
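
On that last point, Swift gives you a thread-safe singleton almost for free: a static let stored property is initialized lazily and exactly once, even if several threads race to touch it, using a dispatch_once-style mechanism under the hood. A minimal sketch, with a made-up APIClient type:

```swift
// A minimal sketch of a thread-safe singleton (APIClient is a made-up type).
// Swift initializes `static let` properties lazily and exactly once,
// even when several threads access `shared` at the same time.
final class APIClient {
    static let shared = APIClient()
    private init() { }          // prevent callers from creating extra instances

    func fetchData() {
        // networking code would go here
    }
}

// Usage: every caller, on any thread, gets the same instance.
APIClient.shared.fetchData()
```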

So, let’s just take a step back and review some vocabulary and concepts that need to be understood before we start discussing GCD:

  • Task = a single piece of work (e.g., an API call) that needs to be completed.
  • Thread = a path of execution provided by the operating system that allows multiple sets of instructions (aka lines of code) to run at the same time within a single application.
  • Process = a running instance of your application’s executable code, which can contain one or more threads.

Cool, now that we’ve gotten those terms down, let’s talk about ways that tasks can be executed.

Serial Dispatch Queue vs. Concurrent Dispatch Queue

If a task is executed one at a time, it is said to be executed serially. If multiple tasks can be executed at the same time, they are said to be executed concurrently.

Grand Central Dispatch provides dispatch queues (think of a “queue” as a line) that allow you to run blocks of code concurrently, on what is called a concurrent dispatch queue. With a concurrent dispatch queue, multiple threads are available to process multiple blocks of code at the same time. This type of queue differs from a serial dispatch queue, which runs only one piece of work at a time on a single thread.

Serial vs. concurrent dispatch queues
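
Here’s a small sketch of creating each kind of queue in Swift 3 (the queue labels are just made-up identifiers):

```swift
import Dispatch

// A serial queue: blocks run one at a time, in the order they were added.
let serialQueue = DispatchQueue(label: "com.example.serial")

// A concurrent queue: blocks may run at the same time on different threads.
let concurrentQueue = DispatchQueue(label: "com.example.concurrent",
                                    attributes: .concurrent)

serialQueue.async { print("serial task 1") }
serialQueue.async { print("serial task 2") }      // always starts after task 1 finishes

concurrentQueue.async { print("concurrent task A") }
concurrentQueue.async { print("concurrent task B") } // may overlap with task A
```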

Synchronous vs. Asynchronous Execution

Functions can be either synchronous or asynchronous. A synchronous function returns control to the caller only after the task it kicked off has completed. I like to think of it as being in “sync” (fun fact: NSYNC is my favorite boy band of all time). An asynchronous function returns immediately; it doesn’t wait for the task it started to finish before handing control back. Hence, it’s not in sync (I like to think of asynchronous as the Backstreet Boys to my NSYNC).
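
A minimal sketch of the difference using a dispatch queue (the label is just an example):

```swift
import Dispatch

let queue = DispatchQueue(label: "com.example.worker")

// Synchronous: the caller waits here until the block has finished running.
queue.sync {
    print("sync work")
}
print("this prints only after 'sync work'")

// Asynchronous: the call returns immediately; the block runs at some later point.
queue.async {
    print("async work")
}
print("this can print before 'async work'")
```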

Grand Central Dispatch: How do we use it?

As I mentioned earlier, GCD gives us dispatch queues that manage blocks of our code. These dispatch queues handle the tasks we hand to GCD and process them in FIFO (first in, first out) order: the first task added is the first one to start, and so on.

Recall that in a serial dispatch queue, only one block of code is executed at a time. The timing of when each block runs is under GCD’s control. In other words, I have no idea how long blocks 1, 2, and 3 will take to run, but I know two things: 1) each block is going to run one at a time, and 2) they’re going to run in the order they were added.

For a concurrent dispatch queue, multiple blocks of code can be executed at the same time; what GCD does is decide when to start each block. Since multiple blocks run at the same time and their execution will overlap, GCD decides when a block should run on a different core (a core is one of your computer’s processors; my MacBook Pro has two, hence dual-core, and each core can run one thread at a time). GCD figures out if a thread is available and switches blocks of code on and off the threads in its pool to make your code run efficiently. So if you’re thinking, “Wait, I don’t have to figure out which thread to put this code on, GCD will do this efficiently for me?” you’re right.
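
A small sketch of that in action: the same work submitted to a global concurrent queue may land on different threads, entirely at GCD’s discretion (the DispatchGroup is only there so the example waits for all four blocks to finish):

```swift
import Foundation
import Dispatch

let group = DispatchGroup()

for i in 1...4 {
    // GCD picks a thread from its pool for each block.
    DispatchQueue.global().async(group: group) {
        print("task \(i) running on \(Thread.current)")
    }
}

group.wait()   // keep the example alive until all four blocks have finished
```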

Types of Queues

Another thing to note is the different types of queues that GCD manages for us; it’s important to understand them before going further.

By default, everything we execute runs on the “main thread,” via the “main queue.” The main queue is a serial queue (it runs one block at a time). It is important to understand that the main queue handles anything drawn on your screen (UI views) as well as responses to mouse events, touch events, and keyboard events.

In terms of concurrent queues (otherwise known as global dispatch queues), there are four options, each with its own quality of service (QoS) class: user interactive, user initiated, utility, and background. (You can also create your own custom queues, serial or concurrent, but that’s another blog post. :-)

QOS_CLASS_USER_INTERACTIVE = use this concurrent queue for tasks that need to be done immediately (UI updates, handling events). You should use this queue sparingly.

QOS_CLASS_USER_INITIATED = use this concurrent queue for tasks that a user initiates from the UI and expects a response to right away (a button is tapped, etc.).

QOS_CLASS_UTILITY = use this concurrent queue for long-running tasks (e.g., networking, computations). You should pair it with an activity (progress) indicator so users know they are waiting.

QOS_CLASS_BACKGROUND = use this concurrent queue for tasks that can run in the background (aka the user doesn’t need to know about them). You should use it for tasks that don’t involve user interaction and aren’t immediately needed.
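
In Swift 3, these classes map onto the DispatchQoS.QoSClass cases .userInteractive, .userInitiated, .utility, and .background. A quick sketch of kicking off long-running work on the utility queue (generateBigReport is made up for illustration):

```swift
import Dispatch

// Hypothetical long-running work, just for illustration.
func generateBigReport() -> String {
    return (1...1_000_000).reduce(0, +).description
}

// QOS_CLASS_UTILITY corresponds to .utility in the Swift 3 DispatchQueue API.
DispatchQueue.global(qos: .utility).async {
    let report = generateBigReport()
    print("report ready: \(report)")
}
```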

What if I want to force GCD to put something on the “main queue”?

Good question. Well, you can simply tell GCD to put a block of code on the main queue in an asynchronous way:

In my project, I take my completion handler (along with the data it receives) and add it to the main queue. To do this, just take the block of code (in my case, a completion handler) and wrap it in a DispatchQueue.main.async { } call, like the sketch below.
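
Here’s a minimal sketch of that pattern; fetchGreeting and its completion handler are hypothetical stand-ins for my actual API call:

```swift
import Foundation

// Do the slow work on a background queue, then hop back to the main queue
// before the completion handler touches any UI.
func fetchGreeting(completion: @escaping (String) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        let greeting = "Hello from a background thread"   // pretend API result

        // Hand the result back on the main queue so UI code is safe to run.
        DispatchQueue.main.async {
            completion(greeting)
        }
    }
}

// Usage (e.g., inside a view controller):
fetchGreeting { greeting in
    // It's safe to update labels, table views, etc. here.
    print(greeting)
}
```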

There is so much more to the world of concurrency, NSOperation, multi-threading and GCD, but this is a brief overview of what Grand Central Dispatch does and how you can force GCD to put something on the main thread.

Enjoy queueing up your code!

Resources:

GCD — Apple Documentation

Grand Central Dispatch Tutorial — Ray Wenderlich
