Apple’s Developer Conference WWDC just finished its 2023 edition, and just like every year, there has been a lot of new and updated developer content for all Apple platforms.
In this post, I want to introduce a few of my favourite announcements and give a brief overview of how they work.
SwiftData
SwiftData is a new Swift native framework introduced at WWDC 2023 and is available in iOS/iPadOS 17, macOS 14, and adjacent releases. SwiftData enables easy data modelling, data persistence, and even cloud sync for your app’s data.
Simply add the new @Model macro to your model class, and you’re ready to go. SwiftData offers easy integration with SwiftUI through the new .modelContainer and .modelContext view modifiers.
Here’s a quick example of using SwiftData together with SwiftUI:
First, we define our model class using the @Model macro.
import Foundation
import SwiftData

@Model
class Album {
    @Attribute(.unique) var id: String
    var artist: String
    var title: String
    var tracklist: [Track]

    // @Model classes don't get a memberwise initializer, so we define one ourselves.
    init(artist: String, title: String, tracklist: [Track] = []) {
        self.id = UUID().uuidString
        self.artist = artist
        self.title = title
        self.tracklist = tracklist
    }
}
Next, we attach the modelContainer to our app’s Scene:
@main
struct MusicApp: App {
    var body: some Scene {
        WindowGroup {
            MusicView()
        }
        .modelContainer(for: [Album.self, Track.self])
    }
}
Finally, we use the new @Query macro to fetch the data we want to display. The modelContext can be used to insert, update, and delete data, as well as save any changes to disk.
import SwiftUI
import SwiftData

struct MusicView: View {
    @Query(sort: \Album.title) var albums: [Album]
    @Environment(\.modelContext) var modelContext

    var body: some View {
        ForEach(albums) { album in
            AlbumView(album: album)
        }
        Button("Add Album") {
            let newAlbum = Album(
                artist: "Hot Mulligan",
                title: "Why Would I Watch"
            )
            modelContext.insert(newAlbum)
        }
    }
}
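The modelContext can also delete objects and persist pending changes explicitly. A minimal sketch, assuming it lives inside the MusicView above, might look like this:

// Inside MusicView: remove the first album and persist the change right away.
Button("Delete Album") {
    if let album = albums.first {
        modelContext.delete(album)
        // SwiftData autosaves periodically; an explicit save() flushes pending changes immediately.
        try? modelContext.save()
    }
}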
SwiftData models also adopt the new Observable macro, so your SwiftUI code automatically responds to changes in your model and updates your views accordingly.
The above is only a brief introduction to what SwiftData is capable of, so make sure to check out the official documentation and the available developer session videos.
Swift Macros
Swift 5.9 introduces macros to the Swift language. Swift macros transform your source code by generating additional code during compilation. Macros can only add new code to your app; they never delete or modify existing code.
There are two types of macros in Swift 5.9:
- Freestanding Macros
- Attached Macros
Freestanding Macros
Freestanding macros can be used anywhere in your codebase and are prefixed with a number sign (#).
Doug Gregor of the Swift team provides a repository of example macros, including this useful #URL() macro, which checks whether hard-coded URLs are valid at compile time.
Until now, if you wanted to include a hard-coded URL in your Swift code, you had to unwrap the resulting URL? optional yourself:
guard let url = URL(string: "https://mayflower.de") else { return }
Using the #URL macro, the URL string is verified at compile time and automatically unwrapped to a non-optional URL:
let url = #URL("https://blog.mayflower.de") // Compiles
let faultyUrl = #URL("https://blog. mayflower. de") // Compiler error
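For the curious, the declaration side of such a freestanding macro looks roughly like the sketch below. MyMacrosPlugin and URLMacro are placeholder names, and the actual expansion logic lives in a separate SwiftSyntax-based compiler-plugin target.

import Foundation

// Declares a freestanding expression macro whose implementation is provided by
// the "URLMacro" type in the (placeholder) "MyMacrosPlugin" compiler-plugin module.
@freestanding(expression)
public macro URL(_ stringLiteral: String) -> URL = #externalMacro(module: "MyMacrosPlugin", type: "URLMacro")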
Attached Macros
Attached macros are applied directly to a declaration and are prefixed with an at sign (@).
The most prominent examples of attached macros are the new @Observable and @Model macros.
Observable
Adding @Observable to your class automatically makes it conform to the Observable protocol. All of its stored properties are observed by default, so you don’t have to mark them with @Published anymore.
@Observable
class MusicAppViewModel {
    var playerState = PlayerState.paused
    var nowPlaying = ""

    func updatePlayer(state: PlayerState, nowPlaying: String) {
        // All changes are automatically published to observers
        self.playerState = state
        self.nowPlaying = nowPlaying
    }
}
Of course, SwiftUI integrates directly with @Observable:
struct PlayerView: View {
    var viewModel: MusicAppViewModel

    var body: some View {
        if viewModel.playerState == .playing {
            Text(viewModel.nowPlaying)
        } else {
            Text("paused")
        }
    }
}
The view automatically updates itself when viewModel publishes any changes. Notice how we no longer have to mark the observed viewModel with @StateObject or @ObservedObject; this is now handled automatically by the macro.
Check out the official documentation and the available developer session videos for more information, including how to write your own macros.
Interactive Widgets
Widgets and WidgetKit were introduced with iOS 14 and have been widely adopted by developers.
This year, iOS/iPadOS 17 and macOS 14 add the (probably) most requested feature: Widgets can now include buttons and toggles, which can run code within your app without launching it in the foreground.
Code execution is handled using the existing AppIntents API. Simply define a new AppIntent …
import AppIntents

struct PlayPauseIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Play/Pause"

    func perform() async throws -> some IntentResult {
        MusicAppViewModel.togglePlayPause()
        return .result()
    }
}
… and add it to your widget’s button.
import SwiftUI

struct NowPlayingWidget: View {
    var entry: SimpleEntry

    var body: some View {
        VStack {
            Text("Now Playing")
            Button(intent: PlayPauseIntent()) {
                Text("Play/Pause")
            }
        }
    }
}
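Toggles work the same way. A small sketch, assuming the timeline entry carries a hypothetical isPlaying flag, might look like this:

// The toggle's state comes from the (hypothetical) isPlaying flag of the timeline
// entry; flipping it runs PlayPauseIntent without opening the app in the foreground.
Toggle(isOn: entry.isPlaying, intent: PlayPauseIntent()) {
    Text("Playing")
}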
Check out the official documentation and all available developer session videos to learn more.
Swift OpenAPI Generator
Swift OpenAPI Generator is a new Swift package that generates client-side code to make API calls based on an existing OpenAPI (>=v3.0) specification.
It can also be used to generate a server-side implementation of your API. Generated code is compatible with Apple platforms as well as Linux (for example, using the Vapor Web Framework).
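As a rough sketch of what consuming a generated client can look like: the getGreeting operation, its name parameter, and the Servers.server1() URL are all hypothetical here and depend entirely on the OpenAPI document you generate from, while the transport type comes from the companion OpenAPIURLSession package.

import OpenAPIRuntime
import OpenAPIURLSession

// "Client" and "Servers" are generated from the OpenAPI document;
// "getGreeting" is a hypothetical operation defined in that document.
func fetchGreeting() async throws {
    let client = Client(
        serverURL: try Servers.server1(),
        transport: URLSessionTransport()
    )
    let response = try await client.getGreeting(query: .init(name: "WWDC"))
    switch response {
    case .ok(let okResponse):
        print(try okResponse.body.json.message)
    case .undocumented(let statusCode, _):
        print("Unexpected status code: \(statusCode)")
    }
}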
Check out the official documentation and the provided sample code for more information.
Spatial Computing for Apple Vision Pro with SwiftUI
Last (but certainly not least), Apple’s big WWDC announcement this year was the introduction of Apple Vision Pro and its operating system visionOS.
Of course, SwiftUI received some enhancements to support apps on the new visionOS platform. How your SwiftUI app is presented on visionOS depends on the Scene type you use:
WindowGroup
Apps using WindowGroup as their Scene are rendered as 2D windows, with automatic depth-sensitive 3D controls for common containers (like NavigationSplitView or TabView).
Volumetric Window Style
If you want to display full 3D experiences, you can use the new volumetric window style on WindowGroup:
WindowGroup {
    ...
}
.windowStyle(.volumetric)
This allows rendering full 3D objects using Model3D (static models) or the new RealityView (dynamic and interactive models with lighting effects).
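Based on the WWDC sessions (and untestable until the SDK ships), a minimal Model3D sketch could look like this. The "Globe" USDZ asset is a hypothetical resource bundled with the app.

import SwiftUI
import RealityKit

struct GlobeView: View {
    var body: some View {
        // Loads the (hypothetical) "Globe" USDZ asset asynchronously and shows
        // a spinner while the model is being resolved.
        Model3D(named: "Globe") { model in
            model
                .resizable()
                .aspectRatio(contentMode: .fit)
        } placeholder: {
            ProgressView()
        }
    }
}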
ImmersiveSpace
For an even more immersive experience, you can use the new ImmersiveSpace scene type, which hides all other currently visible apps.
ImmersiveSpace supports two immersion styles:
- Mixed: Your content is still presented within your “real world” surroundings
- Full: Your app takes complete control of the presented surroundings, enabling a fully immersive experience
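Again based on the WWDC material, declaring such a scene could look roughly like the sketch below, reusing the MusicApp scene from the SwiftData example and a hypothetical SolarSystemView as immersive content. The space itself is then opened at runtime via the openImmersiveSpace environment action.

@main
struct MusicApp: App {
    @State private var immersionStyle: ImmersionStyle = .mixed

    var body: some Scene {
        WindowGroup {
            MusicView()
        }

        // The immersive scene starts in the .mixed style and can switch to .full.
        // "SolarSystemView" is a hypothetical RealityView-based view.
        ImmersiveSpace(id: "solarSystem") {
            SolarSystemView()
        }
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full)
    }
}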
Note: As of this writing, Apple has not yet released the visionOS developer SDK, meaning it’s currently impossible to try these features out for yourself.
Check out the official documentation and developer session videos for more information on how to bring new and existing apps to visionOS.
Conclusion
These were just a few of my favourite developer announcements from WWDC23. I can’t wait to spend more time with the new and updated APIs and use them in my own apps.
Also, stay tuned for some more iOS/SwiftUI-related content this summer.
Thanks for reading <3