The Android XR system uses interaction models similar to those of mobile and large-screen apps, so users already know how to get around. It includes familiar patterns like the home screen, app overview, and back stack.
To help you build integrated and boundless experiences, Android XR provides natural gesture navigation, multimodal inputs, and new spatial and 3D capabilities.
Home Space and Full Space modes
A user can experience your app in two modes: Home Space and Full Space. In Home Space, users can multitask, running your app side by side with other apps. In Full Space, your app takes center stage as the focus of the user's experience, with full access to the immersive capabilities of Android XR.
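If your app lets users switch modes explicitly, the Jetpack XR SDK provides mode requests on the SceneCore session. Below is a minimal Compose sketch assuming the developer-preview API, where the session's spatialEnvironment exposes requestFullSpaceMode() and requestHomeSpaceMode(); these names have shifted between androidx.xr alphas, so verify them against the release you target.

```kotlin
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.xr.compose.platform.LocalSession
import androidx.xr.compose.platform.LocalSpatialCapabilities

// Sketch: let the user move between Home Space and Full Space.
// Assumes the Jetpack XR developer preview, where the SceneCore session's
// spatialEnvironment exposes requestFullSpaceMode()/requestHomeSpaceMode();
// check these names against the androidx.xr version you depend on.
@Composable
fun SpaceModeToggle() {
    val session = LocalSession.current ?: return
    if (LocalSpatialCapabilities.current.isSpatialUiEnabled) {
        // Spatial UI capabilities are present, so the app is in Full Space.
        Button(onClick = { session.spatialEnvironment.requestHomeSpaceMode() }) {
            Text("Exit immersive mode")
        }
    } else {
        Button(onClick = { session.spatialEnvironment.requestFullSpaceMode() }) {
            Text("Go immersive")
        }
    }
}
```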
Give users control over their environment
In Android XR, an environment is the real or virtual space that a user sees while wearing an XR device. It is unconstrained by the physical limitations of mobile and desktop screens.
- A spatial environment immerses the user in a fully virtual space that replaces their physical surroundings, and is available in Full Space only. For example, a user watches a movie in a virtual luxury cinema.
- A passthrough environment adds digital elements to a user's physical surroundings. For example, a user opens multiple large-screen apps while simultaneously seeing their real-life room.
Learn how to build spatial environments in Full Space.
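As a sketch of what that looks like in code, the snippet below sets a virtual-cinema style environment using the Jetpack SceneCore developer preview. The asset names are placeholders, and the loader and preference APIs (ExrImage.create, GltfModel.create, setSpatialEnvironmentPreference) are assumptions based on the preview docs; confirm them against your androidx.xr.scenecore version.

```kotlin
import androidx.xr.scenecore.ExrImage
import androidx.xr.scenecore.GltfModel
import androidx.xr.scenecore.Session
import androidx.xr.scenecore.SpatialEnvironment

// Sketch: apply a fully immersive environment (skybox plus geometry) while
// in Full Space. API names follow the Jetpack SceneCore developer preview
// and may differ in your androidx.xr.scenecore version; "skybox.exr" and
// "cinema.glb" are placeholder assets bundled with the app.
suspend fun applyVirtualCinema(session: Session) {
    val skybox = ExrImage.create(session, "skybox.exr")     // 360-degree backdrop
    val geometry = GltfModel.create(session, "cinema.glb")  // surrounding 3D geometry
    session.spatialEnvironment.setSpatialEnvironmentPreference(
        SpatialEnvironment.SpatialEnvironmentPreference(skybox, geometry)
    )
}
```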
System environments
Users can choose environments provided by the Android XR system. These system environments are available in both Home Space and Full Space. If an app doesn't define its own environment, it inherits the system environment, whether that's passthrough or a virtual environment.
Design with multimodal inputs
Design immersive applications that are accessible to a wide range of users, and allow them to customize input methods to suit their individual preferences and abilities.
To help you achieve this, Android XR supports a variety of input methods, including hand and eye tracking, voice commands, Bluetooth-connected keyboards, traditional and adaptive mice, trackpads, and six degrees of freedom (6DoF) controllers. Your app should automatically work with these built-in modalities.
Make sure you provide visual or audio feedback to confirm user actions for any interaction model you choose.
Learn about design considerations for XR accessibility.
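One practical consequence of the built-in modalities working automatically: the system delivers eye, hand, and controller select actions to your app as standard Android input events, so an ordinary Compose click target already covers every modality. This minimal sketch uses only standard Jetpack Compose APIs and adds pressed-state feedback so each input method gets visual confirmation.

```kotlin
import androidx.compose.foundation.interaction.MutableInteractionSource
import androidx.compose.foundation.interaction.collectIsPressedAsState
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.*

// A single click target that works for gaze-and-pinch, hands, and
// controllers alike, because Android XR maps all of them to standard
// input events. The pressed state drives explicit visual feedback.
@Composable
fun ConfirmingButton(onConfirm: () -> Unit) {
    val interactionSource = remember { MutableInteractionSource() }
    val pressed by interactionSource.collectIsPressedAsState()
    Button(onClick = onConfirm, interactionSource = interactionSource) {
        Text(if (pressed) "Selecting…" else "Select")
    }
}
```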
Hand tracking enables natural interactions. Gestures should be comfortable to perform repeatedly and shouldn't require large hand or arm movements for extended periods. If you render virtual hands, make sure they track the user's real hands accurately. If you define custom gestures, prefer small, localized gestures over large sweeping movements, as in the sketch that follows.
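If you build custom gestures on raw hand data, ARCore for Jetpack XR exposes tracked hands. The sketch below assumes the developer-preview API, where Hand.right(session) provides a StateFlow of joint poses; treat the exact names as assumptions and verify them against the current androidx.xr.arcore release.

```kotlin
import androidx.xr.arcore.Hand
import androidx.xr.runtime.Session

// Sketch: observe the tracked right hand to drive a small, localized
// gesture. Hand.right(session) and the state/handJoints names follow the
// ARCore for Jetpack XR developer preview and are assumptions; check them
// against your androidx.xr.arcore version.
suspend fun watchRightHand(session: Session) {
    val hand = Hand.right(session) ?: return
    hand.state.collect { state ->
        // handJoints maps joint types to poses; comparing the thumb-tip and
        // index-tip poses detects a pinch without any large arm movement.
        android.util.Log.d("HandTracking", "tracked joints: ${state.handJoints.size}")
    }
}
```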
Voice commands are useful for hands-free interaction. Users can dictate text inputs and perform some app interactions with spoken instructions through Gemini. For example, a user might say "Open Google Maps" to open that app.
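Commands like "Open Google Maps" are handled by the system, with no app code required. If your app defines its own spoken shortcuts, the standard Android recognizer intent works on XR as it does on mobile. This minimal sketch launches system speech recognition and dispatches the top transcript; handleCommand is a hypothetical app-specific handler.

```kotlin
import android.content.Intent
import android.speech.RecognizerIntent
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts

// Sketch: in-app voice commands via the standard recognizer intent.
class VoiceCommandActivity : ComponentActivity() {
    private val speechLauncher =
        registerForActivityResult(ActivityResultContracts.StartActivityForResult()) { result ->
            // Read the highest-confidence transcript, if any.
            val spoken = result.data
                ?.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS)
                ?.firstOrNull()
            spoken?.let(::handleCommand)
        }

    private fun startListening() {
        speechLauncher.launch(
            Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).putExtra(
                RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
            )
        )
    }

    // Hypothetical dispatcher mapping a transcript onto an app action,
    // e.g. "play" resumes video playback.
    private fun handleCommand(transcript: String) { /* app-specific dispatch */ }
}
```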
Eye tracking enables effortless interactions, such as selecting objects by looking at them. To minimize eye strain, offer alternative input methods.
Peripheral devices. Android XR supports external devices like Bluetooth keyboards, mice, and 6DoF controllers. For controllers, provide intuitive button mappings, and consider letting users remap buttons to suit their preferences, as in the sketch below.
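Controller and peripheral buttons arrive as ordinary Android KeyEvents, so remapping amounts to keeping the button-to-action table in user-editable state. The action names and default bindings in this sketch are illustrative, not part of any Android XR API.

```kotlin
import android.view.KeyEvent

// Illustrative app actions; define whatever your app needs.
enum class AppAction { SELECT, BACK, MENU }

// User-editable bindings: expose these in a settings screen so users can
// remap buttons to suit their preferences. Defaults are illustrative.
val buttonBindings = mutableMapOf(
    KeyEvent.KEYCODE_BUTTON_A to AppAction.SELECT,
    KeyEvent.KEYCODE_BUTTON_B to AppAction.BACK,
    KeyEvent.KEYCODE_BUTTON_START to AppAction.MENU,
)

// Call from your activity's or view's key-event callback.
fun onControllerKey(event: KeyEvent): Boolean {
    val action = buttonBindings[event.keyCode] ?: return false // unhandled
    if (event.action == KeyEvent.ACTION_UP) performAction(action)
    return true
}

fun performAction(action: AppAction) { /* dispatch to app logic */ }
```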
Understanding system gestures
Android XR extends familiar mobile actions like press, pinch, and swipe to a gesture-based navigation system.
Users navigate by turning the palm of their primary hand inward, then pinching and holding their index finger and thumb together. Moving the hand up, down, left, or right highlights options, and releasing the pinch selects one. Users can set their primary hand preference in Input Settings.
Privacy considerations
Android's privacy recommendations apply to XR apps as well. Obtain user consent before collecting any personally identifiable information, limit data collection to the essentials, and store what you collect securely.
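In practice, consent means gating any sensor-backed data collection behind an explicit runtime permission request and degrading gracefully on denial. This sketch uses the standard AndroidX Activity Result API; the CAMERA permission is illustrative, and XR-specific permissions may differ by SDK version.

```kotlin
import android.Manifest
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts

// Sketch: request consent before enabling a data-collecting feature.
// Construct in onCreate (the launcher must be registered before the
// activity is started).
class ConsentGate(activity: ComponentActivity, private val onGranted: () -> Unit) {
    private val launcher = activity.registerForActivityResult(
        ActivityResultContracts.RequestPermission()
    ) { granted ->
        // On denial, keep the app usable with sensible defaults.
        if (granted) onGranted()
    }

    // CAMERA is a placeholder; request only what the feature truly needs.
    fun request() = launcher.launch(Manifest.permission.CAMERA)
}
```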
Follow Android XR's app quality guidelines.