
A Guide on How to Eliminate Thread Explosions in iOS: GCD and Swift Concurrency

2025/12/02 10:53

Introduction

Thread explosion is a situation where multiple threads run simultaneously, causing performance degradation and memory overhead. In this article, we explore how to eliminate thread explosions and how Swift Concurrency helps prevent them.

Thread Explosion in GCD

Foundation

The system doesn’t document exactly how many threads it will create. Based on the WWDC session “Swift Concurrency: Behind the Scenes”, we can infer a limit of roughly 16 threads per CPU core.

Let’s consider the following code:

import Foundation

let queue = DispatchQueue(label: "com.nsvasilev.concurrent-queue", attributes: .concurrent)

for _ in 0...127 {
    queue.async {
        sleep(5)
    }
}

The queue is concurrent, and it schedules 128 tasks without limiting the number of active threads. Each sleep(5) call simulates a heavy, blocking operation. Because every task blocks the thread it runs on, the system may spawn a large number of worker threads, causing performance degradation and increased memory and CPU usage.

The output is separated into two groups of 64 elements each, because the thread limit here is 64: the first 64 tasks each occupy a thread, and the remaining 64 start only as threads free up. Within each group, the operations can complete in any order, since we are dealing with a concurrent queue.
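To observe this batching yourself, you can timestamp each task’s start. This is a minimal sketch; the queue label and the 12-second wait at the end are illustrative assumptions, not part of the original example:

```swift
import Foundation

// Sketch: log each task's start time to see the two batches of 64.
// With a 64-thread cap, the printed timestamps cluster into two groups
// roughly 5 seconds apart.
let queue = DispatchQueue(label: "com.example.observe-batching", attributes: .concurrent)
let formatter = DateFormatter()
formatter.dateFormat = "HH:mm:ss.SSS"

for index in 0...127 {
    queue.async {
        print("Task \(index) started at \(formatter.string(from: Date()))")
        sleep(5) // simulate a blocking, "heavy" operation
    }
}

// Keep the process alive long enough for both batches to run
// (only needed in a command-line context).
sleep(12)
```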

Grand Central Dispatch (GCD) doesn’t have a built-in mechanism to prevent thread explosion. Next, we examine thread explosion in concurrent and serial GCD queues.

Deadlocks in Concurrent Queues

Thread explosion may lead to deadlocks in concurrent queues. Let’s consider the following example:

import Foundation

let queue1 = DispatchQueue(label: "com.nsvasilev.concurrent-queue1", attributes: .concurrent)
let queue2 = DispatchQueue(label: "com.nsvasilev.concurrent-queue2", attributes: .concurrent)
let dispatchSemaphore = DispatchSemaphore(value: 0)

(0..<64).forEach { _ in
    queue1.async {
        dispatchSemaphore.wait()
    }
}

(0..<64).forEach { _ in
    queue2.async {
        dispatchSemaphore.signal()
    }
}

It may not be obvious that this code deadlocks. The first queue schedules 64 tasks, each waiting on the semaphore. With a thread limit of 64, these tasks occupy every available thread, yet none can proceed: each is blocked, waiting for a signal from the second queue. Meanwhile, the second queue is trying to run its own tasks, which would signal the semaphore.

[Figure: Deadlock]

But since all threads are blocked in the first queue’s wait() calls, none of the signal() tasks from queue2 can run, resulting in a deadlock in which both queues wait on each other indefinitely.

There are three possible solutions.

  1. Use OperationQueue to limit simultaneous tasks.

    let operationQueue = OperationQueue()
    operationQueue.maxConcurrentOperationCount = 5
    operationQueue.qualityOfService = .background

    let queue = DispatchQueue(label: "com.nsvasilev.concurrent-queue2", attributes: .concurrent)
    let dispatchSemaphore = DispatchSemaphore(value: 0)

    (0..<64).forEach { _ in
        operationQueue.addOperation {
            dispatchSemaphore.wait()
        }
    }

    (0..<64).forEach { _ in
        queue.async {
            dispatchSemaphore.signal()
        }
    }

  2. Use DispatchSemaphore to limit simultaneous tasks.

    let queue = DispatchQueue(label: "com.nsvasilev.concurrent-queue", attributes: .concurrent)
    let dispatchSemaphore = DispatchSemaphore(value: 3)

    (0..<128).forEach { _ in
        dispatchSemaphore.wait()
        queue.async {
            dispatchSemaphore.signal()
        }
    }

The semaphore gates admission to the queue: wait() runs before each dispatch, so no more than 3 operations are in flight at any time.

  3. Use Swift Concurrency to prevent thread explosion.


Swift Concurrency prevents thread explosion by design. We will explore how it does so later in this article.
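As a preview, the 128 blocking operations from the first example can be expressed as a task group. This is a sketch of the Swift Concurrency alternative, not code from the article; Task.sleep stands in for the blocking sleep(5):

```swift
import Foundation

// Sketch: the cooperative thread pool bounds the number of threads to the
// CPU core count, so no thread explosion occurs even with 128 child tasks.
func runAll() async {
    await withTaskGroup(of: Void.self) { group in
        for _ in 0...127 {
            group.addTask {
                // Task.sleep suspends the task without blocking a thread,
                // unlike sleep(5), which would occupy a pool thread.
                try? await Task.sleep(nanoseconds: 5_000_000_000)
            }
        }
    }
}
```

Because suspended tasks release their threads, all 128 “operations” can be in flight on a handful of threads at once.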

Deadlocks in Serial Queues

Deadlocks can occur in serial queues as well. The overall GCD thread pool, shared by concurrent and serial queues, is capped at 512 threads.

let dispatchSemaphore = DispatchSemaphore(value: 0)

for _ in 0...511 {
    let queue1 = DispatchQueue(label: "com.nsvasilev.concurrent-queue1")
    queue1.async {
        dispatchSemaphore.wait()
    }
}

for _ in 0...511 {
    let queue2 = DispatchQueue(label: "com.nsvasilev.concurrent-queue2")
    queue2.asyncAfter(deadline: .now() + 1.0) {
        dispatchSemaphore.signal()
    }
}

[Figure: Serial Queues Deadlock]

The first loop creates 512 serial queues, each running a task that blocks on the semaphore, consuming all available threads. With no threads left, the second loop’s signal() blocks can never run, so the program deadlocks.

Swift Concurrency

Swift Concurrency prevents thread explosion. In this section, we look at how it manages tasks efficiently and avoids creating excessive threads.

Priorities

In Swift Concurrency, the three commonly used task priorities are .userInitiated, .utility, and .background.

  • .userInitiated: high priority, for actions the user is actively waiting on.
  • .utility: medium priority, for longer-running background processing.
  • .background: low priority, for tasks that can run entirely in the background.

Let’s consider some examples:

// The high-priority task for user-initiated actions.
Task(priority: .userInitiated) {
    await loadSomeData()
}

// The medium-priority task for background processing.
Task(priority: .utility) {
    await processDataInBackground()
}

// The low-priority task that can run in the background.
Task(priority: .background) {
    await performBackgroundCleanup()
}

In this example, loadSomeData() is assigned the highest priority, .userInitiated, indicating a user-facing task that requires prompt execution. Background work, such as cleanup, runs at .background priority, so it doesn’t block critical operations.

How Swift Concurrency Manages Threads

Let’s rewrite the earlier examples using Swift Concurrency. Each task performs a “heavy” blocking operation, and we will run it at different priorities.

Tasks With the Same Priority Level

Let’s consider the following example:

func runTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds)
    }
}

for _ in 0...127 {
    runTask(seconds: 2)
}

We can see that only 6 tasks run simultaneously, which equals the number of CPU cores on my device.

[Figure: User Initiated Tasks]

If you pause the execution, you may get a clearer picture of what happens behind the scenes.

[Figure: The Number of Tasks]

We may notice that all of these operations run inside com.apple.root.user-initiated-qos.cooperative, the cooperative thread pool, which limits the number of threads so it doesn’t exceed the number of CPU cores.

Based on this observation, it’s clear that Swift Concurrency prevents thread explosion: it doesn’t create more threads than there are CPU cores on the device.
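One way to check this claim on your own device is to track the peak number of simultaneously running tasks and compare it against the core count. This is a sketch for experimentation, not part of the original article; ConcurrencyCounter is a hypothetical helper:

```swift
import Foundation

// Sketch: measure peak concurrency of the cooperative pool.
// Actors serialize access, so the counter needs no locks.
actor ConcurrencyCounter {
    private var current = 0
    private(set) var peak = 0

    func enter() {
        current += 1
        peak = max(peak, current)
    }

    func leave() {
        current -= 1
    }
}

let counter = ConcurrencyCounter()

for _ in 0...127 {
    Task(priority: .userInitiated) {
        await counter.enter()
        sleep(2) // blocking work occupies one cooperative-pool thread
        await counter.leave()
    }
}

// After all 128 tasks finish, reading the peak from an async context:
//     let peak = await counter.peak
// should not exceed ProcessInfo.processInfo.activeProcessorCount.
```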

Tasks With All Priority Levels at Once

Let’s look at another example. In this case, we’ll run tasks with different priorities and observe what happens.

func runUserInitiatedTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds)
    }
}

func runUserUtilityTask(seconds: UInt32) {
    Task(priority: .utility) {
        print("Utility: \(Date())")
        sleep(seconds)
    }
}

func runUserBackgroundTask(seconds: UInt32) {
    Task(priority: .background) {
        print("Background: \(Date())")
        sleep(seconds)
    }
}

for _ in 0...127 {
    runUserInitiatedTask(seconds: 2)
}

for _ in 0...127 {
    runUserUtilityTask(seconds: 2)
}

for _ in 0...127 {
    runUserBackgroundTask(seconds: 2)
}

As we can see, the .utility and .background pools are each limited to 1 thread while a higher-priority (.userInitiated) workload is running.

[Figure: The Number of Tasks]

In this specific scenario, the maximum number of threads is 8.

The Last Example

What if we add some delays before starting each group of tasks?

In this example, we demonstrate how introducing delays before starting each group of tasks impacts their execution. There are three types of tasks, each with a different priority: .background, .utility, and .userInitiated. Each task sleeps for a specified duration, and the tasks start in three groups, with a 2-second delay between each group.

func runUserBackgroundTask(seconds: UInt32) {
    Task(priority: .background) {
        print("Background: \(Date())")
        sleep(seconds)
    }
}

func runUserUtilityTask(seconds: UInt32) {
    Task(priority: .utility) {
        print("Utility: \(Date())")
        sleep(seconds)
    }
}

func runUserInitiatedTask(seconds: UInt32) {
    Task(priority: .userInitiated) {
        print("User Initiated: \(Date())")
        sleep(seconds)
    }
}

for _ in 0...127 {
    runUserBackgroundTask(seconds: 2)
}

sleep(2)

for _ in 0...127 {
    runUserUtilityTask(seconds: 2)
}

sleep(2)

for _ in 0...127 {
    runUserInitiatedTask(seconds: 2)
}

We can see that all three pools (background, utility, and user-initiated) are processing tasks on multiple threads simultaneously. Interestingly, if the lower-priority work is started first and given some time to run, the higher-priority work doesn’t seem to negatively impact the performance of the lower-priority work.

This suggests that the system is capable of handling tasks with different priorities efficiently, maintaining a balance even when tasks run concurrently.

Conclusion

While GCD doesn’t inherently prevent thread explosion, it offers tools for managing concurrency, such as task queues and dispatch groups. However, thread explosion can occur if tasks are dispatched incorrectly or in excess. In contrast, Swift Concurrency improves upon this by offering a more structured and efficient approach to concurrency management.

By prioritizing tasks and limiting concurrency based on available resources, Swift Concurrency reduces the risk of thread explosion and optimizes performance, ensuring that multiple tasks can run concurrently without overwhelming the system.
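The bounded-concurrency idea from the DispatchSemaphore example translates naturally to Swift Concurrency: a task group can keep at most N child tasks in flight by awaiting one completion before adding the next. The sketch below follows that common pattern; runBounded and its parameters are illustrative names, not an established API:

```swift
import Foundation

// Sketch: run `operations` with at most `maxConcurrent` in flight at once.
// Unlike the semaphore version, no thread is ever blocked while waiting.
func runBounded(maxConcurrent: Int,
                operations: [@Sendable () async -> Void]) async {
    await withTaskGroup(of: Void.self) { group in
        var iterator = operations.makeIterator()

        // Seed the group with the first `maxConcurrent` operations.
        for _ in 0..<maxConcurrent {
            if let op = iterator.next() {
                group.addTask { await op() }
            }
        }

        // Each time a child task finishes, start the next one.
        while await group.next() != nil {
            if let op = iterator.next() {
                group.addTask { await op() }
            }
        }
    }
}
```

Calling await runBounded(maxConcurrent: 3, operations: ops) mirrors the DispatchSemaphore(value: 3) example, but suspension replaces blocking, so the cooperative pool stays fully usable.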

