Google has released the schedule for its upcoming I/O developer conference, and it includes developer-centric sessions that promise more insight into the upcoming Android XR operating system. Judging from the schedule, though, Google doesn’t appear ready to get particularly vocal about Android XR just yet.
Android XR has been out of the spotlight since Google announced it in December alongside Samsung’s “Project Moohan” mixed reality headset. Both are expected to launch later this year, but neither has a firm release date.
The company has teased many Android XR features, including much-anticipated support for passthrough camera access, and has opened developer access to the Android XR SDK, but we’re still waiting to see how it will stack up against more mature XR ecosystems such as Meta’s Horizon OS and Apple’s visionOS.
Google I/O runs from May 20th to 21st and includes many livestreamed keynotes, but only two developer talks dedicated to Android XR have been announced, and neither will be livestreamed. There is, however, a livestreamed “What’s New in Android” session that promises to touch on Android XR as well.
The fact that the only sessions promising meaningful Android XR information are developer talks left out of the livestream suggests that Google wants to keep folding XR into the developer-facing Android ecosystem while keeping it away from the public eye of the livestreamed keynotes.
That much can be gleaned from the session descriptions (reproduced below):
With Android XR headed toward a still-undated release later this year, Google is preparing developers with a new XR toolchain ahead of launch. Currently in developer preview, Jetpack XR lets mobile and large-screen Android app developers create spatialized layouts with 3D models and immersive environments. Notably, the inclusion of ARCore in Jetpack XR suggests Google is consolidating its spatial computing tools, giving developers a unified way to build both AR and VR experiences.
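For a concrete sense of what those spatialized layouts look like, here is a minimal sketch using Compose for XR from the current Jetpack XR developer preview. It assumes the preview’s Subspace and SpatialPanel APIs and the SubspaceModifier extensions, all of which may change before a stable release:

```kotlin
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

// A minimal spatialized layout: ordinary 2D Compose content hosted
// in a floating panel the user can move and resize in 3D space.
@Composable
fun SpatializedScreen() {
    Subspace {
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1024.dp)   // panel dimensions within the 3D scene
                .height(640.dp)
                .movable()        // let the user reposition the panel
                .resizable()      // and resize it
        ) {
            // Existing 2D UI renders unchanged inside the spatial panel.
            Text(text = "Hello from Android XR", modifier = Modifier.fillMaxSize())
        }
    }
}
```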
The talk also focuses on adding XR features such as 3D models, hand tracking, and stereoscopic video to existing apps, which suggests Android XR is courting more than just game developers.
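Loading a 3D model into an existing app, for instance, would go through the SceneCore library’s entity APIs. The sketch below is only a rough illustration: the Session, GltfModel, and GltfModelEntity names reflect our reading of the developer preview documentation, the asset path is made up, and any of it may change before release:

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.lifecycle.lifecycleScope
import androidx.xr.scenecore.GltfModel
import androidx.xr.scenecore.GltfModelEntity
import androidx.xr.scenecore.Session
import kotlinx.coroutines.launch

class ModelActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Create a SceneCore session tied to this activity (preview API).
        val session = Session.create(this)
        lifecycleScope.launch {
            // Load a glTF asset bundled with the app; the path is
            // illustrative, and the suspend factory is assumed from
            // the preview docs.
            val model = GltfModel.create(session, "models/spaceship.glb")
            // Wrap the model in an entity so it is placed in the user's
            // space, where it can then be positioned, scaled, or animated.
            GltfModelEntity.create(session, model)
        }
    }
}
```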
Google is also set to extend Jetpack Compose, its declarative UI toolkit, to XR. That suggests the company wants to standardize UI design across mobile, tablet, and XR form factors, making it easy to port or adapt existing UIs to immersive environments.
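In practice, that cross-form-factor story presumably rests on runtime capability checks, so a single codebase spatializes itself only where the device allows. A minimal sketch, assuming the preview’s LocalSpatialCapabilities API:

```kotlin
import androidx.compose.runtime.Composable
import androidx.xr.compose.platform.LocalSpatialCapabilities

// One composable serving phones, tablets, and headsets: use the
// spatialized layout only when the device supports spatial UI.
@Composable
fun AdaptiveScreen() {
    if (LocalSpatialCapabilities.current.isSpatialUiEnabled) {
        SpatializedScreen() // the Subspace/SpatialPanel layout sketched above
    } else {
        FlatScreen()        // the app's unchanged 2D Compose layout
    }
}

// Placeholder for the existing mobile/large-screen UI.
@Composable
fun FlatScreen() { /* ... */ }
```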
Notably, the second talk (shown below) highlights upcoming AI features built into Android XR, hinting that capabilities such as real-time object recognition, scene understanding, and AI-generated environments could be on the horizon.
That said, neither talk will be livestreamed, which could mean Google isn’t ready to trumpet Android XR just yet. We’d also like to hear more about Samsung’s upcoming “Project Moohan” headset, which will be the first to support Android XR.
In any case, we’ll be tuning in to the livestreams, listening for anything new, and reporting on the technical sessions.
Build a differentiated app for Android XR with 3D content
Dereck Bridié, Developer Relations Engineer; Patrick Fuentes, Developer Relations Engineer
“We’ll introduce Jetpack SceneCore and ARCore for Jetpack XR and guide developers through the process of adding immersive content, such as 3D models, 3D videos, and hand tracking, to existing apps.”
Android XR with Compose and AI: the future is now
Jan Kleinert, Developer Relations Engineer; Cecilia Abadie, Senior Product Manager
“Discover the future of immersive experiences with Android XR. This session covers the latest updates to the Android XR SDK beta launching with I/O, including Jetpack Compose for XR extensions and cutting-edge AI features. Learn how to leverage your existing investments in large-screen development to easily expand your reach into the exciting world of Android XR.”