Android XR SDK Developer Preview 2 has been released, with new features and improvements to help developers create immersive experiences.
New features include support for rendering 180° and 360° stereoscopic video, layouts that adapt to different XR display configurations, and enhanced Material Design for XR components.
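As a rough illustration of an adaptive layout, the sketch below branches on whether spatial UI is enabled and falls back to a conventional 2D layout otherwise. It assumes the androidx.xr.compose Developer Preview artifacts (Subspace, SpatialPanel, LocalSpatialCapabilities); package and API names may change before a stable release, and AppContent is a placeholder for an app's existing UI.

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.platform.LocalSpatialCapabilities
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

@Composable
fun AppContent() {
    // Placeholder for the app's existing 2D UI.
    Text("Hello, Android XR")
}

@Composable
fun AdaptiveLayout() {
    // When spatial UI is available (e.g. the app is running in full space on a headset),
    // host the content in a spatial panel; otherwise fall back to the 2D layout.
    if (LocalSpatialCapabilities.current.isSpatialUiEnabled) {
        Subspace {
            SpatialPanel(modifier = SubspaceModifier.width(1280.dp).height(800.dp)) {
                AppContent()
            }
        }
    } else {
        AppContent()
    }
}
```

The same composable then works unchanged on displays without spatial capabilities, which is the point of letting the layout adapt rather than forking the UI.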
ARCore for Jetpack XR now supports hand tracking, providing a natural input method for XR experiences (see the sketch below).
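A minimal sketch of consuming hand-tracking data might collect a hand-state flow and read a joint pose. The names assumed here (androidx.xr.runtime.Session, Hand.left, HandJointType.PALM) follow the ARCore for Jetpack XR preview documentation and may differ in the shipped artifacts.

```kotlin
import androidx.xr.arcore.Hand
import androidx.xr.arcore.HandJointType
import androidx.xr.runtime.Session
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.launch

// Collect the left hand's tracking state and read the palm joint's pose.
// Assumes an already created and resumed XR Session with hand tracking enabled.
fun observeLeftHand(session: Session, scope: CoroutineScope) {
    val leftHand = Hand.left(session) ?: return // hand tracking unavailable on this device
    scope.launch {
        leftHand.state.collect { state ->
            val palmPose = state.handJoints[HandJointType.PALM]
            if (palmPose != null) {
                // React to the palm pose, e.g. anchor a UI affordance to it.
            }
        }
    }
}
```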
Updates to the Android XR Emulator bring improved stability, support for AMD GPUs, and integration with the Android Studio UI.
Unity developers can benefit from performance improvements and new features in the Unity OpenXR: Android XR package, including support for Dynamic Refresh Rate and hand meshes with occlusion.
Firebase AI Logic for Unity is now in public preview, allowing developers to integrate AI into their apps for unique AI-powered experiences on Android XR.
Google continues to support open standards, collaborating on the glTF Interactivity specification and preparing for the launch of Android XR on upcoming devices.
Developers can prepare differentiated apps for the Play Store on Android XR by testing them in the emulator and creating standout store assets such as 180° videos and screenshots.
Developers interested in Android XR on glasses can look forward to future updates on how to get involved in the developer experience.
To start developing for Android XR, visit developer.android.com/develop/xr for the necessary tools, libraries, and information, and share feedback to help shape Android XR.