Developers get their slice of the Apple Vision Pro

The visionOS SDK is available: let the app development begin!

Karen Moltenbrey

Apple has revealed its much-ballyhooed AR headset/spatial computer. While the device will not ship until early 2024, developers can get a jump on building for it as the Apple visionOS SDK starts to roll out.

A little over two weeks ago, we were introduced to the Apple Vision Pro, the company’s long-anticipated, much-speculated-about AR headset. Now that we’ve gotten that first look out of our system, let’s move our focus to the creation of spatial experiences for the platform. After all, users will need things to keep them busy when the headset finally becomes available early next year, or so Apple estimates.

While users have begun drumming their fingers as they anxiously await the device’s release date, developers have been doing likewise, waiting for the visionOS software development kit. For this latter group, that day is here.

Apple SDK
(Source: Apple)

Calling the Vision Pro simply an AR device minimizes its capabilities. Users are able to interact with digital content in their physical space using their hands, eyes, and voice. Apple calls the device a spatial computer, and bills visionOS as the first spatial operating system. Developers will use the visionOS SDK to create apps that blend digital content with the physical environment for new types of experiences.

Developers can start building visionOS apps using the tools and frameworks they already know from other Apple platforms, including Xcode, SwiftUI, RealityKit, ARKit, and TestFlight, to create new types of apps across a spectrum of immersion. visionOS supports three modes of presentation: windows, which have depth and can showcase 3D content; volumes, which create experiences that are viewable from any angle; and spaces, which can fully immerse a user in an environment with unbounded 3D content.
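In SwiftUI, these three presentation modes correspond to scene types an app declares up front. Below is a minimal sketch of how a visionOS app might offer all three; the view names and scene identifiers are illustrative assumptions, not Apple sample code.

```swift
import SwiftUI
import RealityKit

@main
struct SpatialSampleApp: App {
    var body: some Scene {
        // A window: a bounded surface that can also present 3D content with depth.
        WindowGroup(id: "main") {
            MainView()
        }

        // A volume: a bounded 3D container viewable from any angle.
        WindowGroup(id: "model") {
            ModelView()
        }
        .windowStyle(.volumetric)

        // A space: unbounded 3D content that can fully immerse the user.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // RealityKit entities for the immersive scene would be added here.
            }
        }
    }
}

struct MainView: View {
    // Environment action for opening the immersive space declared above.
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Button("Enter Immersive Space") {
            Task { _ = await openImmersiveSpace(id: "immersive") }
        }
    }
}

struct ModelView: View {
    var body: some View {
        Text("Volumetric 3D content goes here")
    }
}
```

An app can mix these scenes freely, starting the user in a familiar window and escalating to a volume or a fully immersive space only when the experience calls for it.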

A new tool, available with Xcode, is Reality Composer Pro, which enables developers to preview and optimize 3D models, animations, images, and sounds for the device. A visionOS simulator lets developers interact with their apps to test room layouts and lighting.

“By taking advantage of the space around the user, spatial computing unlocks new opportunities for our developers and enables them to imagine new ways to help their users connect, be productive, and enjoy new types of entertainment,” said Susan Prescott, Apple’s vice president of worldwide developer relations.

To assist developers in their work, Apple is opening developer labs across the globe, providing hands-on experience so they can test their apps on Vision Pro hardware. Support will also be available from Apple engineers. The labs will be located in Cupertino, California; London; Munich, Germany; Shanghai, China; Singapore; and Tokyo. Developers can also apply for dev kits to work directly on Vision Pro.

The visionOS SDK, along with the updated Xcode, Simulator, and Reality Composer Pro, is available now for members of the Apple Developer Program. Other developers can apply for a dev kit starting next month.