Async/Await in Swift: Best Practices
Master Swift's concurrency model with async/await. Learn how to write clean, safe, and efficient asynchronous code while avoiding common pitfalls and performance issues.
Published on November 8, 2024 • 13 min read
Introduction
Swift's async/await concurrency model has revolutionized how we write asynchronous code, making it more readable, maintainable, and less error-prone than traditional completion handler patterns. However, using it well requires understanding a set of best practices that help you avoid common pitfalls and performance issues.
In this comprehensive guide, I'll share the patterns and practices I've learned from implementing async/await in production iOS applications. We'll cover everything from basic patterns to advanced concurrency techniques, error handling, and performance optimization strategies.
Understanding Async/Await Fundamentals
Before diving into best practices, it's crucial to understand how async/await works under the hood and how it differs from completion handlers.
Basic Async Function:
// Traditional completion handler approach
func fetchUser(id: String, completion: @escaping (Result<User, Error>) -> Void) {
    // Note: the host is a placeholder; the original relative path would not resolve
    URLSession.shared.dataTask(with: URL(string: "https://api.example.com/users/\(id)")!) { data, response, error in
        if let error = error {
            completion(.failure(error))
            return
        }
        guard let data = data else {
            completion(.failure(APIError.noData))
            return
        }
        do {
            let user = try JSONDecoder().decode(User.self, from: data)
            completion(.success(user))
        } catch {
            completion(.failure(error))
        }
    }.resume()
}

// Modern async/await approach
func fetchUser(id: String) async throws -> User {
    let url = URL(string: "https://api.example.com/users/\(id)")!
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(User.self, from: data)
}
The async/await version is not only more concise but also eliminates callback nesting and makes error handling more straightforward.
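When migrating incrementally, you'll often need to call a legacy completion-handler API from async code. A minimal sketch of bridging with `withCheckedThrowingContinuation` (the `legacyFetchValue` stub here is hypothetical, standing in for any callback-based API):

```swift
import Foundation

// A legacy completion-handler API (stub for illustration).
func legacyFetchValue(completion: @escaping (Result<Int, Error>) -> Void) {
    DispatchQueue.global().async { completion(.success(42)) }
}

// Bridge it into async/await with a checked continuation.
// The continuation must be resumed exactly once.
func fetchValue() async throws -> Int {
    try await withCheckedThrowingContinuation { continuation in
        legacyFetchValue { result in
            continuation.resume(with: result)
        }
    }
}
```

The checked variant traps (with a diagnostic) if the continuation is resumed twice or leaked, which makes bridging mistakes visible during development.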
Structured Concurrency Patterns
Structured concurrency ensures that async operations are properly managed and cancelled when their parent context is cancelled. Here are the key patterns:
1. TaskGroup for Parallel Operations:
func fetchMultipleUsers(ids: [String]) async throws -> [User] {
    return try await withThrowingTaskGroup(of: User.self) { group in
        // Add tasks for each user ID
        for id in ids {
            group.addTask {
                try await self.fetchUser(id: id)
            }
        }
        // Collect results
        var users: [User] = []
        for try await user in group {
            users.append(user)
        }
        return users
    }
}

// Alternative: collect results as they complete
func fetchUsersStreamingResults(ids: [String]) async throws -> [User] {
    try await withThrowingTaskGroup(of: User.self) { group in
        for id in ids {
            group.addTask { try await self.fetchUser(id: id) }
        }
        // Process results as they become available
        var users: [User] = []
        users.reserveCapacity(ids.count)
        for try await user in group {
            users.append(user)
            // Could update UI here for streaming updates
        }
        return users
    }
}
2. Sequential Operations with Dependencies:
func setupUserProfile(userId: String) async throws -> UserProfile {
    // Step 1: Fetch user data
    let user = try await fetchUser(id: userId)

    // Step 2: Fetch user preferences (depends on user data)
    let preferences = try await fetchUserPreferences(userId: user.id)

    // Step 3: Parallel fetch of independent data
    async let avatarImage = fetchUserAvatar(url: user.avatarURL)
    async let friendsList = fetchUserFriends(userId: user.id)
    async let activityFeed = fetchUserActivity(userId: user.id)

    // Wait for all parallel operations to complete
    let (avatar, friends, activity) = try await (avatarImage, friendsList, activityFeed)

    return UserProfile(
        user: user,
        preferences: preferences,
        avatar: avatar,
        friends: friends,
        recentActivity: activity
    )
}
Error Handling Best Practices
Proper error handling in async/await requires understanding how errors propagate and implementing appropriate recovery strategies.
1. Graceful Error Recovery:
func fetchUserWithFallback(id: String) async -> User? {
    do {
        // Try primary data source
        return try await fetchUser(id: id)
    } catch {
        print("Primary fetch failed: \(error.localizedDescription)")
        do {
            // Try cache as fallback
            return try await fetchUserFromCache(id: id)
        } catch {
            print("Cache fetch failed: \(error.localizedDescription)")
            return nil
        }
    }
}

// For partial failure tolerance
func fetchUsersWithPartialFailure(ids: [String]) async -> [User] {
    await withTaskGroup(of: User?.self) { group in
        for id in ids {
            group.addTask {
                do {
                    return try await self.fetchUser(id: id)
                } catch {
                    print("Failed to fetch user \(id): \(error)")
                    return nil
                }
            }
        }
        var users: [User] = []
        for await user in group {
            if let user = user {
                users.append(user)
            }
        }
        return users
    }
}
2. Custom Error Types: Create meaningful error types for better error handling:
enum APIError: Error, LocalizedError {
    case invalidURL
    case noData
    case decodingError(Error)
    case networkError(Error)
    case serverError(statusCode: Int)

    var errorDescription: String? {
        switch self {
        case .invalidURL:
            return "Invalid URL provided"
        case .noData:
            return "No data received from server"
        case .decodingError(let error):
            return "Failed to decode data: \(error.localizedDescription)"
        case .networkError(let error):
            return "Network error: \(error.localizedDescription)"
        case .serverError(let statusCode):
            return "Server error with status code: \(statusCode)"
        }
    }
}
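Typed errors like these pair naturally with validating the HTTP response before decoding. A minimal sketch, using a trimmed-down version of the enum above (`validateStatus` is an illustrative helper, not a standard API):

```swift
import Foundation

// Trimmed-down version of the APIError enum for this sketch.
enum APIError: Error {
    case serverError(statusCode: Int)
}

// Throw a typed error for any non-2xx status before attempting to decode.
func validateStatus(_ statusCode: Int) throws {
    guard (200..<300).contains(statusCode) else {
        throw APIError.serverError(statusCode: statusCode)
    }
}
```

In a real fetch function you would cast the returned `URLResponse` to `HTTPURLResponse`, pass its `statusCode` through a check like this, and only then decode the body.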
Task Management and Cancellation
Proper task management is crucial for preventing memory leaks and ensuring responsive apps. Here's how to handle task lifecycle effectively:
1. Task Cancellation Patterns:
class DataManager: ObservableObject {
    @Published var users: [User] = []
    @Published var isLoading = false
    private var loadingTask: Task<Void, Never>?

    func loadUsers() {
        // Cancel previous loading task
        loadingTask?.cancel()
        loadingTask = Task { @MainActor in
            isLoading = true
            do {
                let fetchedUsers = try await fetchAllUsers()
                // Check for cancellation
                try Task.checkCancellation()
                self.users = fetchedUsers
            } catch is CancellationError {
                print("Task was cancelled")
            } catch {
                print("Failed to load users: \(error)")
            }
            isLoading = false
        }
    }

    func cancelLoading() {
        loadingTask?.cancel()
        loadingTask = nil
    }

    deinit {
        loadingTask?.cancel()
    }
}
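A common companion to cancellation is a timeout. One way to sketch it is to race the operation against `Task.sleep` in a task group: whichever child finishes first supplies the result, and the group cancels the loser (`withTimeout` and `TimeoutError` are illustrative names, not standard APIs):

```swift
import Foundation

struct TimeoutError: Error {}

// Race `operation` against a deadline. The first child to finish
// (with a value or an error) wins; the other is cancelled.
func withTimeout<T: Sendable>(
    seconds: Double,
    operation: @escaping @Sendable () async throws -> T
) async throws -> T {
    try await withThrowingTaskGroup(of: T.self) { group in
        group.addTask { try await operation() }
        group.addTask {
            try await Task.sleep(nanoseconds: UInt64(seconds * 1_000_000_000))
            throw TimeoutError()
        }
        // `next()` returns the first completed child's result or rethrows
        // its error. Force-unwrap is safe: two tasks were just added.
        let result = try await group.next()!
        group.cancelAll()
        return result
    }
}
```

Note that this only interrupts the slow operation if it cancels cooperatively; the timeout itself fires regardless.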
2. Cooperative Cancellation: Check for cancellation in long-running operations:
func processLargeDataset(_ items: [DataItem]) async throws -> [ProcessedItem] {
    var processedItems: [ProcessedItem] = []
    for (index, item) in items.enumerated() {
        // Check for cancellation every 10 items
        if index % 10 == 0 {
            try Task.checkCancellation()
        }
        let processed = await processItem(item)
        processedItems.append(processed)
        // Yield to other tasks periodically
        if index % 100 == 0 {
            await Task.yield()
        }
    }
    return processedItems
}
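When wrapping a resource that cannot poll for cancellation itself (a socket, a timer, a legacy session task), `withTaskCancellationHandler` runs a handler promptly when the surrounding task is cancelled, even while the operation is suspended. A minimal sketch (`fetchWithCleanup` is a hypothetical name):

```swift
import Foundation

// Run work while registering an immediate side effect on cancellation.
func fetchWithCleanup() async throws -> String {
    try await withTaskCancellationHandler {
        // The operation itself; it still participates cooperatively.
        try Task.checkCancellation()
        return "payload"
    } onCancel: {
        // Invoked promptly when the surrounding task is cancelled,
        // giving you a hook to tear down non-cooperative resources.
        print("cancelled: releasing resources")
    }
}
```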
Performance Optimization
Async/await performance optimization involves understanding task scheduling, avoiding unnecessary context switches, and implementing efficient concurrency patterns.
1. Efficient Batch Processing:
func processBatchesEfficiently<T, R>(
    items: [T],
    batchSize: Int = 10,
    maxConcurrency: Int = 3,
    processor: @escaping (T) async throws -> R
) async throws -> [R] {
    // Note: `chunked(into:)` and `asyncMap` are small helper extensions
    // on Collection/Sequence, not standard library APIs.
    let batches = items.chunked(into: batchSize)
    return try await withThrowingTaskGroup(of: [R].self, returning: [R].self) { group in
        var results: [R] = []
        var activeTasks = 0
        var batchIndex = 0

        // Start initial batch of tasks
        while activeTasks < maxConcurrency && batchIndex < batches.count {
            let batch = batches[batchIndex]
            group.addTask {
                try await batch.asyncMap(processor)
            }
            activeTasks += 1
            batchIndex += 1
        }

        // Process results and add new tasks as they complete
        for try await batchResult in group {
            results.append(contentsOf: batchResult)
            activeTasks -= 1
            // Add next batch if available
            if batchIndex < batches.count {
                let batch = batches[batchIndex]
                group.addTask {
                    try await batch.asyncMap(processor)
                }
                activeTasks += 1
                batchIndex += 1
            }
        }
        return results
    }
}
2. Memory-Efficient Streaming: Process large datasets without loading everything into memory:
struct DataStream: AsyncSequence {
    typealias Element = DataItem
    private let source: DataSource

    init(source: DataSource) {
        self.source = source
    }

    func makeAsyncIterator() -> DataStreamIterator {
        DataStreamIterator(source: source)
    }
}

struct DataStreamIterator: AsyncIteratorProtocol {
    private let source: DataSource
    private var currentOffset = 0

    init(source: DataSource) {
        self.source = source
    }

    mutating func next() async throws -> DataItem? {
        guard !Task.isCancelled else { return nil }
        let item = try await source.fetchItem(at: currentOffset)
        currentOffset += 1
        return item
    }
}

// Usage
func processDataStream() async throws {
    let stream = DataStream(source: dataSource)
    for try await item in stream {
        let processed = await processItem(item)
        await saveProcessedItem(processed)
    }
}
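For push-based sources (delegate callbacks, notifications, timers), `AsyncStream` can be a lighter alternative to hand-rolling an `AsyncSequence` conformance. A minimal sketch with a simulated producer:

```swift
import Foundation

// Wrap a push-based producer in an AsyncStream. Here the producer is
// simulated by yielding a fixed series of values; in practice you would
// call `continuation.yield(_:)` from a delegate callback and
// `continuation.finish()` when the source completes.
func makeNumberStream() -> AsyncStream<Int> {
    AsyncStream { continuation in
        for n in 1...3 {
            continuation.yield(n)
        }
        continuation.finish()
    }
}
```

Consumption looks the same as the custom sequence above: `for await n in makeNumberStream() { ... }`, and the stream honors cancellation of the consuming task.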
SwiftUI Integration
Integrating async/await with SwiftUI requires careful attention to main actor isolation and proper task lifecycle management:
struct UserListView: View {
    @StateObject private var viewModel = UserListViewModel()

    var body: some View {
        NavigationView {
            List(viewModel.users) { user in
                UserRowView(user: user)
            }
            .refreshable {
                await viewModel.refresh()
            }
            .task {
                await viewModel.loadUsers()
            }
            .alert("Error", isPresented: $viewModel.showError) {
                Button("Retry") {
                    Task { await viewModel.loadUsers() }
                }
                Button("Cancel", role: .cancel) { }
            } message: {
                Text(viewModel.errorMessage)
            }
        }
    }
}

@MainActor
class UserListViewModel: ObservableObject {
    @Published var users: [User] = []
    @Published var isLoading = false
    @Published var showError = false
    @Published var errorMessage = ""
    private let userService = UserService()

    func loadUsers() async {
        isLoading = true
        do {
            users = try await userService.fetchUsers()
        } catch {
            errorMessage = error.localizedDescription
            showError = true
        }
        isLoading = false
    }

    func refresh() async {
        await loadUsers()
    }
}
Common Pitfalls and How to Avoid Them
1. Avoiding Blocking the Main Thread: Always use @MainActor for UI updates and avoid synchronous blocking operations:
// ❌ Bad: Blocking main thread
func updateUI() {
    DispatchQueue.main.sync {
        // This blocks the main thread
        self.users = fetchUsersSync()
    }
}

// ✅ Good: Async main actor update
@MainActor
func updateUI() async throws {
    self.users = try await fetchUsers()
}
2. Proper Task Cleanup: Always clean up tasks to prevent memory leaks:
// ❌ Bad: Task not cleaned up
class BadViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        Task {
            // This task might continue running after the view is dismissed
            await loadData()
        }
    }
}

// ✅ Good: Proper task management
class GoodViewController: UIViewController {
    private var loadingTask: Task<Void, Never>?

    override func viewDidLoad() {
        super.viewDidLoad()
        loadingTask = Task {
            await loadData()
        }
    }

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        loadingTask?.cancel()
    }

    deinit {
        loadingTask?.cancel()
    }
}
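A related pitfall is a long-lived `Task` capturing `self` strongly, which keeps the owning object alive until the task finishes. A sketch of the `[weak self]` pattern (the `Loader` type and its members are illustrative):

```swift
import Foundation

final class Loader {
    private var task: Task<Void, Never>?
    private(set) var result: Int?

    func start() {
        // Capture self weakly so a long-running task does not extend
        // the owner's lifetime; if the Loader is deallocated first,
        // the body simply does nothing.
        task = Task { [weak self] in
            let value = 10 * 2   // stands in for real async work
            self?.result = value
        }
    }

    deinit { task?.cancel() }
}
```

Combined with explicit cancellation, this ensures neither the task nor the object outlives the other unexpectedly.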
Conclusion
Swift's async/await provides powerful tools for writing clean, efficient asynchronous code. By following the best practices outlined in this guide—proper error handling, task management, performance optimization, and avoiding common pitfalls—you can build robust, responsive iOS applications.
The key to mastering async/await is understanding its cooperative nature and designing your asynchronous operations to work well within the structured concurrency model. Start with simple patterns and gradually incorporate more advanced techniques as your understanding deepens.
Remember that async/await is not just about making code compile—it's about creating maintainable, performant, and user-friendly applications. Invest time in understanding these patterns, and your users will benefit from more responsive and reliable iOS apps.