All you need to know about GCD in Swift

G. Abhisek
7 min read · Oct 1, 2022

Concurrency is an important programming concept that helps developers better utilize the hardware capabilities of a device to execute their programs efficiently. With the introduction of async-await, Swift has entered a new era of concurrent programming. Besides async-await, there are other constructs that can help you write concurrent code in Swift: GCD, Operation queues, and Thread (NSThread).

If you are new to Swift, you might be tempted to learn only the brand-new async-await APIs. However, a majority of codebases still use GCD for concurrent programming, and hence learning the concepts of GCD remains important. In this blog, we will discuss the most important APIs of GCD.

This blog assumes you are aware of programming concepts such as concurrency, parallelism, synchronous, asynchronous, and threads.

If you still want to revisit and study a bit more in detail, you can check out my blog, Concurrency, Parallelism, Threads, Processes, Async, and Sync — Related? 🤔.

What is GCD?

GCD, or Grand Central Dispatch, is a low-level API for managing concurrency that Apple introduced back in iOS 4. It manages a collection of threads underneath. GCD provides APIs to achieve concurrency with the help of queues, and these queues are known as dispatch queues.


DispatchQueue is the interface for scheduling tasks on the various threads available to the system. Queues can be either serial or concurrent:

Serial Queue

As the name suggests, the serial queue runs tasks serially one after the other in the order they are submitted.

A serial queue ensures a single task is executed at a given instance of time.
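A minimal sketch of this behavior (the queue label is illustrative): tasks dispatched asynchronously to a serial queue still run one at a time, in submission order.

```swift
import Foundation

// A serial queue executes one task at a time, in submission order.
let serialQueue = DispatchQueue(label: "com.example.serial") // label is illustrative
var order: [Int] = []

for i in 1...3 {
    serialQueue.async {
        order.append(i) // safe: only one task ever runs on this queue at a time
    }
}

serialQueue.sync { } // block until all previously submitted tasks have finished
print(order) // [1, 2, 3]
```

The trailing `sync { }` is a simple way to wait for a serial queue to drain: it cannot start until every task submitted before it has completed.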

Concurrent Queue

  • Concurrent queues help us run multiple tasks at the same time.
  • GCD controls when a particular task in a concurrent queue should start and when and how threads and cores would be used.
  • GCD ensures the tasks are started in the order they are submitted.
  • You can not predict the order of completion of tasks submitted to a concurrent queue.
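The points above can be sketched as follows (labels are illustrative). The tasks start in order but may overlap, so a serial "guard" queue protects the shared counter:

```swift
import Foundation

// attributes: .concurrent makes this queue run tasks simultaneously.
let concurrentQueue = DispatchQueue(label: "com.example.concurrent", attributes: .concurrent)
let counterQueue = DispatchQueue(label: "com.example.counter") // serial queue guarding shared state
let group = DispatchGroup()
var completed = 0

for _ in 1...10 {
    concurrentQueue.async(group: group) {
        // Tasks start in submission order but may run simultaneously
        // and finish in any order.
        counterQueue.sync { completed += 1 }
    }
}

group.wait() // block until every task has finished
print(completed) // 10
```

All ten tasks complete, but you cannot predict the order in which they finished.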

Tasks can be submitted to a queue asynchronously or synchronously. This brings us to another often misunderstood topic: async vs sync.

Asynchronous vs Synchronous execution

  • In a synchronous programming model, tasks are executed one after another. Each task waits for any previous task to be completed and then gets executed.
  • In an asynchronous programming model, when one task gets executed, you could switch to a different task without waiting for the previous one to get completed.

When you call .sync on a queue, the following happens:

  • You ask the queue to execute the submitted task synchronously.
  • The calling thread is made to wait until the dispatched task has completed execution.

When you call .async on a queue, the following happens:

  • You ask the queue to execute the submitted task asynchronously.
  • The calling thread will not wait for the dispatched task to finish.
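A small sketch of the difference (queue label is illustrative). With .sync the caller is guaranteed to see the finished task; with .async it usually is not:

```swift
import Foundation

let queue = DispatchQueue(label: "com.example.demo")

// .sync: the calling thread blocks until the task completes.
var syncDone = false
queue.sync { syncDone = true }
print(syncDone) // true — guaranteed, because sync waits

// .async: the calling thread does not wait.
var asyncDone = false
queue.async {
    Thread.sleep(forTimeInterval: 0.1) // simulate slow work
    asyncDone = true
}
// asyncDone is very likely still false here — the task has not finished yet.

queue.sync { } // drain the queue before inspecting the result
print(asyncDone) // true — by now the async task has completed
```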

GCD provides you with three major types of queues: the main queue, global queues, and custom queues, all of which broadly fall into serial or concurrent queues.

Main Queue

  • Main queue is a serial queue that runs on the main thread.
  • Main thread is responsible for handling UI events and interactions. Given this fact, the main queue becomes the default and obvious choice to perform UI updates.
  • You should never call sync on the main queue from the main thread. Since the main queue is serial, the main thread would be waiting for a task that cannot start until that very wait ends, resulting in a deadlock that crashes your app.
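The usual pattern looks like this sketch, where `statusLabelText` is a hypothetical stand-in for a UILabel's text in a real app, and the run-loop pumping at the end is only needed in a command-line context:

```swift
import Foundation

// Hypothetical example: update UI state after background work.
var statusLabelText = "Loading…"

DispatchQueue.global(qos: .userInitiated).async {
    let result = "42 items loaded" // pretend this came from a network call

    // Hop back to the main queue for anything that touches the UI.
    DispatchQueue.main.async {
        statusLabelText = result
    }
}

// In an app, the main run loop drains the main queue for you.
// In a script, pump it manually until the update lands (with a timeout).
let deadline = Date().addingTimeInterval(2)
while statusLabelText == "Loading…" && Date() < deadline {
    RunLoop.main.run(until: Date().addingTimeInterval(0.05))
}
print(statusLabelText)
```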

Global Queue

  • Global queues are a group of queues that are available to the system for the execution of tasks.
  • There are four global queue priorities, in decreasing order: high, default, low, and background. These legacy priorities have been superseded by QoS classes, which we cover below.
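In modern code you grab a global queue by QoS class rather than by priority name. A minimal sketch:

```swift
import Foundation

// Global queues are concurrent and shared system-wide; you fetch one by QoS.
let globalQueue = DispatchQueue.global(qos: .userInitiated)
let group = DispatchGroup()
var didRun = false

globalQueue.async(group: group) {
    didRun = true // some work running on a system-provided global queue
}

group.wait() // block until the task completes
print(didRun) // true
```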

Custom Queue

GCD gives you the capability to create custom queues, which can be either serial or concurrent, to handle your own work.

Understanding a DispatchQueue initializer

The initializer of a DispatchQueue has the following signature:

init(label: String, qos: DispatchQoS = .unspecified, attributes: DispatchQueue.Attributes = [], autoreleaseFrequency: DispatchQueue.AutoreleaseFrequency = .inherit, target: DispatchQueue? = nil)
  • label: a string identifier that helps you uniquely identify the queue in instrumentation and debugging tools.
  • qos: the quality-of-service class for the queue. You do not specify thread priorities directly; you express them through QoS classes, which the system maps to priorities.
  • attributes: options for the queue, such as .concurrent. If you do not pass .concurrent, the queue is serial.
  • autoreleaseFrequency: the frequency at which the objects created by the dispatch queue to schedule your submitted tasks are autoreleased.
  • target: the queue on which the dispatched tasks will actually execute. E.g., if queue A has queue B as its target, all tasks submitted to queue A will run on queue B. You can follow this wonderful Stack Overflow thread to gain more insights on this.

Declaring a serial queue:

let serialQueue = DispatchQueue(label: "serial-queue")

Declaring a concurrent queue:

let concurrentQueue = DispatchQueue(label: "concurrent-queue", attributes: .concurrent)

Quality of Service (QoS)

QoS classes are used to specify the priority of the tasks you want to schedule. There are four predefined QoS classes:

  • User-interactive: Operations that the user is directly interacting with such as animations or updates to your app’s user interface. This has the highest priority.
  • User-initiated: Operations that are directly initiated by the user. These operations need to be executed quickly.
  • Utility: This represents long-running computations. You should use this QoS for operations such as data import, network calls, etc.
  • Background: This represents operations that the user is not directly aware of. You should use it for maintenance or cleanup tasks that don’t need any user interaction.
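For instance, a long-running import could go on a utility-QoS custom queue. A minimal sketch (the label and work are illustrative):

```swift
import Foundation

// A custom utility-QoS queue, e.g. for a long-running data import.
let importQueue = DispatchQueue(label: "com.example.import", qos: .utility)
var importedCount = 0

importQueue.sync {
    // Stand-in for real import work.
    importedCount = 100
}

print(importedCount) // 100
```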

There are two special QoS classes that developers should not use directly:

  • Default:

From docs,

The priority level of this QoS falls between user-initiated and utility. This QoS is not intended to be used by developers to classify work. Work that has no QoS information assigned is treated as default, and the GCD global queue runs at this level.

  • Unspecified:

From docs,

This represents the absence of QoS information and cues the system that an environmental QoS should be inferred. Threads can have an unspecified QoS if they use legacy APIs that may opt the thread out of QoS.

Managing Tasks and Shared Resources


DispatchWorkItem

GCD provides many handy APIs to manage the tasks submitted to its queues. Any task submitted to GCD is encapsulated in a DispatchWorkItem.

A DispatchWorkItem can be canceled before it executes by calling cancel() on it.

You can also check whether a DispatchWorkItem has been canceled via its isCancelled property.
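A small sketch of cancellation (the queue label is illustrative). An item cancelled before it starts is skipped by the queue:

```swift
import Foundation

let queue = DispatchQueue(label: "com.example.workitem")
var executed = false

let workItem = DispatchWorkItem {
    executed = true
}

// Cancel before it runs: a cancelled item is never executed.
workItem.cancel()
print(workItem.isCancelled) // true

queue.async(execute: workItem)
queue.sync { } // drain the queue
print(executed) // false — the cancelled item never ran
```

Note that cancel() does not stop a work item that has already started executing; it only prevents future execution.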


DispatchGroup

DispatchGroup lets you batch tasks and receive a notification when all of them have completed, which also lets you sequence dependent work after a batch finishes.

DispatchGroup has the following operations:

  • Enter - Call enter() to manually notify the group that a task has started.
  • Leave - Notify the group that this work is done by calling leave().
  • Notify - notify(queue:work:) submits a block to the group which is executed when all tasks have completed execution.

The number of enter() calls must be balanced by the same number of leave() calls; an extra leave() will crash your app, and a missing one will keep the group waiting forever.
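Putting the three operations together (labels are illustrative; a serial queue guards the shared array):

```swift
import Foundation

let group = DispatchGroup()
let queue = DispatchQueue(label: "com.example.group", attributes: .concurrent)
let resultsQueue = DispatchQueue(label: "com.example.results") // guards shared state
var results: [Int] = []

for i in 1...3 {
    group.enter()                                     // tell the group a task started
    queue.async {
        resultsQueue.sync { results.append(i) }
        group.leave()                                 // must balance every enter()
    }
}

// notify fires once every enter() has been balanced by a leave().
group.notify(queue: resultsQueue) {
    print("All \(results.count) tasks completed")
}

group.wait()          // for this sketch, also block until everything finishes
resultsQueue.sync { } // drain the notify block before inspecting results
```

In real code you would usually rely on notify alone; wait() blocks the calling thread and is shown here only to keep the sketch linear.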


DispatchBarrier

A dispatch barrier helps prevent readers-writers problems while dealing with shared resources.

Pass the .barrier flag while submitting a task to mark it as a barrier task. .barrier is one of the DispatchWorkItemFlags, which let you customize the behavior of a work item. A barrier task waits until every task submitted before it has finished, and then executes as the only task on the queue. Once the barrier task has completed, the queue returns to its default concurrent execution.

You can use barriers in a custom concurrent queue to ensure thread safety while performing read-write interactions with shared resources.

Let us take the following example

Here, the tasks submitted to the queue before the barrier task (Task 1 and Task 2) are executed first, and then the barrier task (Task 3) runs. Task 4, submitted after Task 3, is executed only once the barrier task has completed.
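The original scenario can be reconstructed as the following sketch (labels and task names are illustrative; a serial queue records the execution order):

```swift
import Foundation

let queue = DispatchQueue(label: "com.example.barrier", attributes: .concurrent)
let logQueue = DispatchQueue(label: "com.example.log") // serial queue guarding the log
var log: [String] = []
func record(_ name: String) { logQueue.sync { log.append(name) } }

queue.async { record("Task 1") }                  // may run concurrently with Task 2
queue.async { record("Task 2") }
queue.async(flags: .barrier) { record("Task 3") } // waits for 1 & 2, then runs alone
queue.async { record("Task 4") }                  // starts only after the barrier

queue.sync(flags: .barrier) { }                   // drain everything
print(log)
```

Task 1 and Task 2 may appear in either order, but Task 3 is always third and Task 4 always last.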


DispatchSemaphore

A semaphore is an abstraction built around a counter that is used to manage access to a shared resource by multiple threads. GCD provides the DispatchSemaphore class to help you manage shared resources in a multi-threaded environment.

A thread gets access to the shared resource when the counter (after decrementing) is ≥ 0, and is moved to the waiting-thread queue when it drops below 0.

Using a semaphore in iOS involves the following steps:

  1. Create the DispatchSemaphore with an initial value. This value is the number of threads that can access the shared resource at the same time.

let semaphore = DispatchSemaphore(value: 1)

  2. Whenever you want to access the shared resource, call wait() on the semaphore. This decrements the counter by 1. With an initial value of 1, the counter becomes 0 and the calling thread proceeds; any further wait() blocks until a signal() arrives.

  3. Once you are done with the shared resource, call signal(). This increments the counter by 1 and wakes a waiting thread, if any.


Let us look at a simple example of a semaphore:
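A minimal sketch of such an example (labels are illustrative): one hundred concurrent increments of a shared counter, made safe by a semaphore with a single permit.

```swift
import Foundation

// One "permit": only one thread may touch the shared counter at a time.
let semaphore = DispatchSemaphore(value: 1)
let queue = DispatchQueue(label: "com.example.semaphore", attributes: .concurrent)
let group = DispatchGroup()
var sharedCounter = 0

for _ in 1...100 {
    queue.async(group: group) {
        semaphore.wait()   // decrement; blocks if the permit is taken
        sharedCounter += 1 // safe: only one thread is in this section
        semaphore.signal() // increment; releases the permit
    }
}

group.wait()
print(sharedCounter) // 100 — no lost updates
```

Without the wait()/signal() pair, concurrent increments could interleave and some updates would be lost.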

I would love to hear from you

You can reach me with any query or feedback, or if you just want to have a discussion, through the following channels:

Twitter — @gabhisek_dev


Please feel free to give a few claps and share this blog with your fellow developers.