Concurrency in DeTeXt
July 5 2025
I haven’t had a traditional formal CS education, which hasn’t held me back from getting things done with computer programming. However, I have slowly accumulated a long list of blind spots because of the high level of abstraction that modern scripting languages like Python provide. One of those blind spots is concurrency. This blog post will stay very high-level, but it reflects my current understanding of these difficult concepts, and that understanding helped me implement Swift concurrency in my app DeTeXt.1
Sky-high overview
The first misconception I had to let go of was conflating concurrency with parallelism. Rob Pike describes the distinction succinctly:
Concurrency is about dealing with multiple things at once. Parallelism is about doing multiple things at once.
The dealing with part is important! I didn’t understand the need for this because of how removed I’ve been from low-level programming. YouTuber Core Dumped has [a really good video about concurrency][coredumped], and his explanation finally helped me understand why concurrency is not only important but, crucially, the default. A processor core executes one instruction at a time, yet modern operating systems run hundreds of active processes at any given moment. The operating system (OS) is in charge of figuring out which process gets access to system resources and when. Modern CPUs are so fast that we’re under the illusion that all processes are running simultaneously.2
This fundamental truth about dealing with multiple things that need to get done applies to individual applications and programs as well. Thankfully, application developers can rely on the APIs and abstractions provided by the OS to handle concurrency. For developing applications on Apple platforms in 2025, this means adopting language support for concurrency in Swift.
For my simple app DeTeXt, there were two instances where I needed to deal with concurrency in my code:
- running image recognition with my CoreML model as a separate task on the main thread.
- showing a temporary toast-style pop-up when the user copies a symbol/command/unicode code-point. Once again, this is a task that runs on the main thread.
But what’s a thread? Or a task? And why am I running things on the main thread?
Threads
A process is a program in execution. It has its own program counter, register state, and memory space. However, programs themselves often need to do multiple things at once, and threads are the abstraction that almost all OSes have settled on to handle concurrency within a process. Every process has a main thread, which is the initial thread where all work first happens. The OS typically creates subsequent threads at the developer’s request.
For a user-facing mobile application, the cardinal rule is that all user interface (UI) work must happen on the main thread. Updating the user interface is a short operation that immediately affects the user experience; users can tolerate waiting for a large download or a file save, but the app itself should never crash or lag. These critical operations happen on the main thread, which has OS-level priority within the application.
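To make the thread picture concrete, here is a minimal Swift sketch (not DeTeXt code) using Foundation’s Thread API; it only shows the main thread versus a developer-spawned thread, and the print statements stand in for real work.

```swift
import Foundation

// The program starts out on the main thread.
print("On main thread? \(Thread.isMainThread)")   // true

// The OS creates this additional thread at the developer's request.
Thread.detachNewThread {
    // A reasonable place for non-UI work; any UI update would have to
    // hop back to the main thread, e.g. via DispatchQueue.main.async.
    print("Detached thread is main? \(Thread.isMainThread)")   // false
}

// Give the detached thread a moment to run before the program exits.
Thread.sleep(forTimeInterval: 0.1)
```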
DeTeXt is a pretty simple app: it doesn’t download or upload anything, and all symbols and images are loaded at start-up (they’re only 10MB anyway). Implementing concurrency in DeTeXt simply meant identifying (and marking) asynchronous functions and their possible suspension points (places where long-running work can occur), and instructing the system to run all asynchronous functions on the main thread, which is where they were running anyway.
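As a rough illustration of that marking, here is a hedged sketch of an asynchronous function with a suspension point. The names runModel(on:) and classifyDrawing(_:) are hypothetical stand-ins, not DeTeXt’s actual code.

```swift
import CoreGraphics

// Hypothetical stand-in for the CoreML prediction call.
func runModel(on image: CGImage) async throws -> [Float] {
    try await Task.sleep(for: .milliseconds(50))   // pretend work
    return [0.9, 0.05, 0.05]
}

// `async` marks the function as asynchronous; `await` marks the possible
// suspension point where long-running work can occur.
func classifyDrawing(_ image: CGImage) async throws -> [Float] {
    try await runModel(on: image)
}
```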
Actors and Tasks
Swift’s concurrency model doesn’t let us work with threads directly. Instead, we work with two abstractions: tasks and actors.
A unit of work that we need to handle with concurrency is a task. Fetching web resources and reading from or writing to the file system are classic examples of work suited to concurrent thinking. For DeTeXt, I defined two tasks that encapsulate the two asynchronous functions3 mentioned earlier:
- Taking the drawing from the on-screen canvas, pre-processing it, and running it through the neural net that calculates probabilities for every symbol. The probabilities are stored in a reference type marked Observable; any changes to the underlying data send notifications to the SwiftUI views that observe it (see the sketch after this list).
- Displaying the name of the command, or the symbol itself, as a toast-style pop-up on screen for a fixed period of time, then automatically dismissing it.
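Here is a sketch of the kind of Observable reference type the first item describes; the type and property names are illustrative, not the actual DeTeXt source.

```swift
import Observation

// Reference type marked with the @Observable macro: SwiftUI views that
// read these properties are updated automatically when they change.
@Observable
final class SymbolPredictions {
    // Filled in after each run of the neural net.
    var probabilities: [String: Float] = [:]

    // Non-nil while the toast-style pop-up should be on screen.
    var toastMessage: String? = nil
}
```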
Both of these tasks need to run on the main thread, since they update the UI. However, both can take an undetermined amount of time to finish: the CPU/GPU/Neural Engine might be busy with some other intensive process (very unlikely, but possible), and the toast task needs to pause for a set amount of time before finishing. Implementing this in DeTeXt couldn’t have been simpler: package the asynchronous call to the CoreML model and the toast’s sleep as tasks.
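Packaging the work as tasks can look roughly like this, reusing the hypothetical classifyDrawing(_:) helper from the earlier sketch; these are stand-ins, not DeTeXt’s actual functions.

```swift
import CoreGraphics

// Called whenever the canvas changes: the prediction runs as its own task.
func drawingDidChange(_ image: CGImage) {
    Task {
        let probabilities = try? await classifyDrawing(image)
        print(probabilities ?? [])   // stand-in for publishing results to the UI
    }
}

// Called when the user copies a symbol: the toast runs as its own task.
func symbolWasCopied(_ name: String) {
    Task {
        print("Copied \(name)")                   // stand-in for showing the toast
        try? await Task.sleep(for: .seconds(2))   // keep it visible briefly
        print("Toast dismissed")                  // stand-in for hiding it
    }
}
```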
We still need to ensure that both tasks run on the main thread, for which we turn to another abstraction: actors. Actors are objects that ensure only one function has access to their mutable data at a time; their role in concurrency is to avoid data races. The Main Actor is in charge of all the data that updates the UI. Since the only mutable data in my app pertains to the UI, I simply needed to mark the two asynchronous functions with the @MainActor attribute. This tells the compiler and the runtime that, while there may be suspension points in these two functions, both of them affect the UI, so they must only run on the Main Actor. The Main Actor abstracts over the main thread; the two are very similar, and the differences have more to do with low-level implementation details.
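Putting it together, a hedged sketch of that pattern, reusing the hypothetical SymbolPredictions and runModel(on:) helpers from the earlier snippets, might look like this:

```swift
import CoreGraphics

// Both functions are isolated to the Main Actor: even across their
// suspension points, the lines here run on the main thread.
@MainActor
func updatePredictions(for image: CGImage, in model: SymbolPredictions) async {
    if let scores = try? await runModel(on: image) {
        model.probabilities["\\alpha"] = scores.first ?? 0   // illustrative
    }
}

@MainActor
func showCopiedToast(_ message: String, in model: SymbolPredictions) async {
    model.toastMessage = message
    try? await Task.sleep(for: .seconds(2))   // suspends without blocking the main thread
    model.toastMessage = nil
}
```

With the attribute in place, any task that calls these functions hops onto the Main Actor automatically.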
Fin
I learned a lot about concurrency and asynchronous functions while re-writing my app to use Swift’s new concurrency features. You can, of course, view the actual code changes in the GitHub repo. To be honest, writing this blog post took more time than actually learning and implementing Swift concurrency in my app! I love explainer blog posts and videos, so I figured I’d take a shot at writing my own, mostly for myself.
1. I didn’t need to add concurrency as it turned out, but it was a good excuse to learn the underlying concepts.
2. On CPUs with multiple cores, multiple processes do run simultaneously. But even the beefiest CPU from Apple has ‘only’ 32 cores. I had 869 active processes when I was writing this post.
3. Asynchronous functions run as part of some task; the task abstraction enables structured concurrency, which I haven’t explored yet.