At WWDC today, Apple announced the headline features of visionOS 26, the next major OS release for the Vision Pro. Among them are new and improved spatial photography features, which should make photos feel even more immersive.
The Vision Pro launched with the ability to display spatial photos captured on either the headset itself or an iPhone 16, 15 Pro, or 15 Pro Max. These spatial photos create a sense of depth and dimension by combining stereo capture with depth mapping applied to the images.
Now, Apple has applied a new generative AI algorithm to “create spatial scenes with multiple perspectives, allowing users to feel they can lean and look around.”
visionOS 26 allows Vision Pro users to view spatial scenes in the Photos app, the Spatial Gallery app, and Safari. The company says developers can add the functionality to their own apps using the spatial scene API.
To show off the new AI-assisted spatial photos feature, real-estate marketplace Zillow says it is adopting the spatial scene API in its Zillow Immersive app for Vision Pro, which allows users to view images of homes and apartments.
Apple’s visionOS 26 is expected to arrive later this year, but the company says testing is already underway.