Develop with Unity for Android XR

This guide provides an overview of developing with Unity for Android XR. Android XR works with the familiar tools and features you've come to expect from Unity, and since Unity's Android XR support is built on top of OpenXR, many of the features described in the OpenXR Overview are also supported in Unity.

Follow this guide to learn about:

  • Unity Support for Android XR
    • Unity XR basics
    • Developing and publishing apps for Android
    • Unity Packages for Android XR
      • Unity OpenXR: Android XR package
      • Android XR Extensions for Unity
      • Features and Compatibility Considerations
  • Input and interaction

Unity Support for Android XR

When you build Unity apps for Android XR, you can take advantage of the mixed reality tools and capabilities in Unity 6. This includes mixed reality templates that use the XR Interaction Toolkit, AR Foundation, and OpenXR Plugin to help you get started quickly. When building apps with Unity for Android XR, we recommend the Universal Render Pipeline (URP) as your render pipeline and Vulkan as your Graphics API. This combination lets you take advantage of newer Unity graphics features that are only supported with Vulkan. Review the project setup guide for more information on how to configure these settings.

Unity XR Basics

If you are new to Unity or XR development, refer to Unity's XR Manual to learn about basic XR concepts and workflows.

Developing and publishing apps for Android

Unity provides in-depth documentation for developing, building and publishing for Android, covering topics including Android permissions in Unity, Android Build Settings, Building your app for Android, and Delivering to Google Play.

Unity Packages for Android XR

There are two packages that provide support for building Unity apps for Android XR. Both of these packages are XR provider plug-ins, which can be enabled through Unity's XR Plug-in Management package. The XR plug-in manager adds a Project Settings pane that handles loading, initialization, settings, and build support for XR plug-ins. To allow your app to execute OpenXR features at runtime, your project must have those features enabled through the plug-in manager.

This image shows an example of where you can enable these feature groups through Unity's editor.

Example of the unity xr plugin management screen

Unity OpenXR: Android XR

The Unity OpenXR: Android XR package is an XR provider plug-in that adds Android XR support to Unity. It provides the majority of the Android XR support for Unity, and it enables Android XR device support for AR Foundation projects. AR Foundation is designed for developers who want to create AR or mixed reality experiences: it provides the interface for AR features, but doesn't implement any features itself. The Unity OpenXR: Android XR package provides that implementation. To get started with this package, view the package manual, which contains a Getting Started guide.

Android XR Extensions for Unity

The Android XR Extensions for Unity package supplements the Unity OpenXR: Android XR package with additional features to help you build immersive experiences. It can be used alone or together with the Unity OpenXR: Android XR package.

To get started with this package, follow our project setup guide or quickstart for importing Android XR Extensions for Unity.

Features and Compatibility Considerations

The following list describes the features supported by the Unity OpenXR: Android XR package and the Android XR Extensions for Unity package. For each feature, it lists the corresponding feature string in each package (n/a where a package doesn't provide the feature), along with use cases and expected behavior, so you can determine which package contains the features you need and any compatibility considerations.

AR Session

  • Unity OpenXR: Android XR feature string: Android XR: AR Session. Feature settings include Optimize Buffer Discards (Vulkan).
  • Android XR Extensions for Unity feature string: Android XR (Extensions): Session Management. Feature settings include Subsampling (Vulkan) and URP SpaceWarp (Vulkan).
  • Use cases and expected behavior: To use features from either package, you must enable the AR Session feature for that package. You can enable both feature sets at the same time; individual features will handle conflicts accordingly.

Device tracking

  • Unity OpenXR: Android XR feature string: n/a
  • Android XR Extensions for Unity feature string: n/a
  • Use cases and expected behavior: Device tracking is used to track the device's position and rotation in physical space. The XR Origin GameObject automatically handles device tracking and transforming trackables into Unity's coordinate system via its XROrigin component and a GameObject hierarchy with a Camera and TrackedPoseDriver.

Camera

  • Unity OpenXR: Android XR feature string: Android XR: AR Camera
  • Android XR Extensions for Unity feature string: n/a
  • Use cases and expected behavior: This feature provides support for light estimation and full screen passthrough.

Plane detection

  • Unity OpenXR: Android XR feature string: Android XR: AR Plane
  • Android XR Extensions for Unity feature string: Android XR (Extensions): Plane
  • Use cases and expected behavior: These two features are identical; use one or the other. Android XR (Extensions): Plane is included so that developers can use the Android XR (Extensions): Object Tracking and persistent anchors features without a dependency on the Unity OpenXR: Android XR package. In the future, Android XR (Extensions): Plane will be removed in favor of Android XR: AR Plane.
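
Regardless of which plane feature you enable, plane data is consumed through AR Foundation. The following sketch assumes an ARPlaneManager component on your XR Origin and AR Foundation 5.x, where the manager raises a planesChanged event (in AR Foundation 6 the equivalent event is trackablesChanged); it is illustrative rather than an Android XR-specific API.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs planes as they are detected by the plane subsystem.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void OnEnable() => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
            Debug.Log($"Detected {plane.alignment} plane {plane.trackableId}");
    }
}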

Object tracking

  • Unity OpenXR: Android XR feature string: n/a
  • Android XR Extensions for Unity feature string: Android XR (Extensions): Object Tracking
  • Use cases and expected behavior: This feature provides support for detecting and tracking objects in the physical environment, used in combination with a reference object library.
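
As a rough sketch of how tracked objects are typically consumed, the example below uses AR Foundation's ARTrackedObjectManager with a reference object library assigned; the event name follows AR Foundation 5.x and the scene setup is an assumption.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs objects recognized from the reference object library assigned to the manager.
public class TrackedObjectLogger : MonoBehaviour
{
    [SerializeField] ARTrackedObjectManager trackedObjectManager;

    void OnEnable() => trackedObjectManager.trackedObjectsChanged += OnChanged;
    void OnDisable() => trackedObjectManager.trackedObjectsChanged -= OnChanged;

    void OnChanged(ARTrackedObjectsChangedEventArgs args)
    {
        foreach (var trackedObject in args.added)
            Debug.Log($"Found {trackedObject.referenceObject.name} at {trackedObject.transform.position}");
    }
}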

Face tracking

  • Unity OpenXR: Android XR feature string: Android XR: AR Face (XR_ANDROID_avatar_eyes only; no face tracking)
  • Android XR Extensions for Unity feature string: Android XR: Face Tracking (XR_ANDROID_face_tracking)
  • Use cases and expected behavior: Avatar eyes support is provided through the Android XR: AR Face feature. Access a user's facial expressions through the Android XR: Face Tracking feature. These two features can be used together, if desired.

Ray casts

  • Unity OpenXR: Android XR feature string: Android XR: AR Raycast (Plane Anchor, Depth Anchor)
  • Android XR Extensions for Unity feature string: n/a
  • Use cases and expected behavior: This feature allows you to cast a ray and calculate the intersection between that ray and plane trackables or depth trackables that are detected in the physical environment.
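
For illustration, the sketch below casts a ray through the center of the view every frame using AR Foundation's ARRaycastManager; the component references are assumptions about your scene setup.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Casts a ray through the center of the view and logs the closest trackable hit.
public class CenterRaycaster : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] Camera xrCamera;

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        Ray ray = xrCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));

        // Raycast against all supported trackable types, including planes and depth.
        if (raycastManager.Raycast(ray, s_Hits, TrackableType.All))
        {
            // Hits are sorted by distance, so the first entry is the closest.
            Debug.Log($"Hit {s_Hits[0].hitType} at {s_Hits[0].pose.position}");
        }
    }
}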

Anchors

  • Unity OpenXR: Android XR feature string: Android XR: AR Anchor
  • Android XR Extensions for Unity feature string: Android XR (Extensions): Anchor. Feature settings include persistence.
  • Use cases and expected behavior: Both features include support for spatial anchors and plane anchors; use one feature or the other. For persistent anchors, use Android XR (Extensions): Anchor. In the future, Android XR (Extensions): Anchor will be removed and all anchor features will be in Android XR: AR Anchor.
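
As a minimal sketch using AR Foundation APIs (rather than anything specific to either feature), you can create a spatial anchor by adding an ARAnchor component at a pose, or a plane anchor with ARAnchorManager.AttachAnchor; the persistence APIs from the Extensions feature are not shown here.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Creates spatial and plane anchors through AR Foundation.
public class AnchorCreator : MonoBehaviour
{
    [SerializeField] ARAnchorManager anchorManager;

    public ARAnchor CreateSpatialAnchor(Pose pose)
    {
        var anchorObject = new GameObject("Spatial Anchor");
        anchorObject.transform.SetPositionAndRotation(pose.position, pose.rotation);
        return anchorObject.AddComponent<ARAnchor>(); // Registered with the active anchor subsystem.
    }

    public ARAnchor CreatePlaneAnchor(ARPlane plane, Pose pose)
    {
        return anchorManager.AttachAnchor(plane, pose);
    }
}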

Occlusion

  • Unity OpenXR: Android XR feature string: Android XR: AR Occlusion (Environment Depth)
  • Android XR Extensions for Unity feature string: n/a
  • Use cases and expected behavior: Occlusion allows mixed reality content in your app to appear hidden or partially obscured behind objects in the physical environment.
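
Occlusion is applied by the rendering setup once the feature and an AROcclusionManager are configured, but you can also inspect the underlying depth data yourself. The sketch below assumes an AROcclusionManager on the XR Origin's camera and uses AR Foundation's environmentDepthTexture property.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Reads the environment depth texture that drives occlusion.
public class DepthTextureReader : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Update()
    {
        Texture2D depth = occlusionManager.environmentDepthTexture;
        if (depth != null)
            Debug.Log($"Environment depth texture: {depth.width} x {depth.height}");
    }
}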

Performance Metrics

  • Unity OpenXR: Android XR feature string: Android XR Performance Metrics
  • Android XR Extensions for Unity feature string: n/a
  • Use cases and expected behavior: Use this feature to access performance metrics for Android XR devices.

Composition Layers

  • Unity OpenXR: Android XR feature string: Composition Layer Support (OpenXR Plugin and XR Composition Layers are required)
  • Android XR Extensions for Unity feature string: Android XR: Passthrough Composition Layer (XR_ANDROID_composition_layer_passthrough_mesh)
  • Use cases and expected behavior: Use Unity's Composition Layer Support to create basic composition layers (for example, quad, cylinder, and projection layers). Android XR: Passthrough Composition Layer can be used to create a passthrough layer with a custom mesh read from a Unity GameObject.

Foveated Rendering

  • Unity OpenXR: Android XR feature string: Foveated Rendering (OpenXR Plugin is required). Supports eye-tracked foveated rendering: the higher-resolution area is centered where the user is currently looking, making the lowered peripheral resolution less apparent to the user.
  • Android XR Extensions for Unity feature string: Foveation (Legacy)
  • Use cases and expected behavior: Foveated rendering speeds up rendering by lowering the resolution of areas in the user's peripheral vision. Unity's foveated rendering feature is only supported for apps using URP and Vulkan. The Foveation (Legacy) feature in the Android XR Extensions for Unity also supports BiRP and GLES. We recommend using Unity's foveated rendering feature when possible; note that both URP and Vulkan are recommended when building for Android XR.
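
With Unity's foveated rendering feature enabled, you can adjust the foveation strength at runtime through the XR display subsystem. This is a minimal sketch and assumes URP with Vulkan, as recommended above.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sets the strongest foveation level and allows gaze-based foveation when available.
public class FoveationController : MonoBehaviour
{
    void Start()
    {
        var displays = new List<XRDisplaySubsystem>();
        SubsystemManager.GetSubsystems(displays);

        foreach (var display in displays)
        {
            display.foveatedRenderingLevel = 1.0f; // 0 = off, 1 = maximum foveation
            display.foveatedRenderingFlags = XRDisplaySubsystem.FoveatedRenderingFlags.GazeAllowed;
        }
    }
}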

Unbounded Reference Space

  • Unity OpenXR: Android XR feature string: n/a
  • Android XR Extensions for Unity feature string: Android XR: Unbounded Reference Space
  • Use cases and expected behavior: This feature sets the XRInputSubsystem tracking origin mode to Unbounded. Unbounded indicates that the XRInputSubsystem tracks all InputDevices in relation to a world anchor, which can change.
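
To confirm the mode at runtime, you can query the XRInputSubsystem directly; the sketch below is a small example using Unity's XR subsystem APIs.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Logs the supported and current tracking origin modes.
public class TrackingOriginLogger : MonoBehaviour
{
    void Start()
    {
        var inputSubsystems = new List<XRInputSubsystem>();
        SubsystemManager.GetSubsystems(inputSubsystems);

        foreach (var subsystem in inputSubsystems)
        {
            Debug.Log($"Supported modes: {subsystem.GetSupportedTrackingOriginModes()}");
            Debug.Log($"Current mode: {subsystem.GetTrackingOriginMode()}");

            // The world anchor can change; recenter events arrive through this callback.
            subsystem.trackingOriginUpdated += s => Debug.Log("Tracking origin updated");
        }
    }
}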

Environment Blend Mode

  • Unity OpenXR: Android XR feature string: n/a
  • Android XR Extensions for Unity feature string: Environment Blend Mode
  • Use cases and expected behavior: This feature allows you to set the XR Environment Blend Mode, which controls how virtual imagery blends with the real-world environment when passthrough is enabled.

Input and interaction

Android XR supports multi-modal natural input.

In addition to hand and eye tracking, peripherals such as 6DoF controllers, mice, and physical keyboards are also supported. Apps for Android XR are therefore expected to support hand interaction, because it cannot be assumed that every device will come with controllers.

Interaction Profiles

Unity uses interaction profiles to manage how your XR application communicates with various XR devices and platforms. These profiles establish the expected inputs and outputs for different hardware configurations, ensuring compatibility and consistent functionality across a range of platforms. By enabling interaction profiles, you can ensure that your XR application functions correctly with different devices, maintains consistent input mapping, and has access to specific XR features. To set an interaction profile:

  1. Open the Project Settings window (menu: Edit > Project Settings).
  2. Click XR Plug-in Management to expand the plug-in section (if necessary).
  3. Select OpenXR in the list of XR plug-ins.
  4. In the Interaction Profiles section, select the + button to add a profile.
  5. Select the profile to add from the list.

Hand Interaction

Hand interaction (XR_EXT_hand_interaction) is provided by the OpenXR Plugin, and you can expose the device layout in the Unity Input System by enabling the Hand Interaction Profile. Use this interaction profile for hand input supported by the four action poses defined by OpenXR: "pinch", "poke", "aim", and "grip". If you need additional hand interaction or hand tracking functionality, refer to XR Hands on this page.
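
As an illustrative sketch, once the Hand Interaction Profile is enabled you can bind Input System actions directly to its layout. The binding path below (including the pinchValue control name) is an assumption based on the profile's documented layout; verify the exact paths in the Input Debugger.

using UnityEngine;
using UnityEngine.InputSystem;

// Reads the left hand's pinch strength from the Hand Interaction Profile layout.
public class PinchReader : MonoBehaviour
{
    InputAction m_PinchValue;

    void OnEnable()
    {
        // Assumed binding path; confirm against the Hand Interaction Profile layout.
        m_PinchValue = new InputAction(binding: "<HandInteraction>{LeftHand}/pinchValue");
        m_PinchValue.Enable();
    }

    void OnDisable() => m_PinchValue.Disable();

    void Update()
    {
        float pinch = m_PinchValue.ReadValue<float>();
        if (pinch > 0.8f)
            Debug.Log($"Left hand pinch: {pinch:F2}");
    }
}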

Eye Gaze Interaction

Eye gaze interaction (XR_EXT_eye_gaze_interaction) is provided by the OpenXR Plugin, and you can use this layout to retrieve the eye pose data (position and rotation) that the extension returns. Read more about eye gaze interaction in the OpenXR Input guide.
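
The sketch below reads the gaze pose through the Input System once the Eye Gaze Interaction Profile is enabled. The <EyeGaze>/pose binding path follows the OpenXR Plugin's layout and should be verified in the Input Debugger for your plugin version.

using UnityEngine;
using UnityEngine.InputSystem;

// Draws a debug ray from the user's eye gaze pose.
public class GazeReader : MonoBehaviour
{
    InputAction m_GazePosition;
    InputAction m_GazeRotation;

    void OnEnable()
    {
        m_GazePosition = new InputAction(binding: "<EyeGaze>/pose/position");
        m_GazeRotation = new InputAction(binding: "<EyeGaze>/pose/rotation");
        m_GazePosition.Enable();
        m_GazeRotation.Enable();
    }

    void OnDisable()
    {
        m_GazePosition.Disable();
        m_GazeRotation.Disable();
    }

    void Update()
    {
        Vector3 origin = m_GazePosition.ReadValue<Vector3>();
        Quaternion rotation = m_GazeRotation.ReadValue<Quaternion>();
        Debug.DrawRay(origin, rotation * Vector3.forward, Color.green);
    }
}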

Controller Interaction

Android XR supports the Oculus Touch Controller Profile for 6DoF controllers. This profile is provided by the OpenXR Plugin.

Mouse Interaction

The Android XR Mouse Interaction Profile (XR_ANDROID_mouse_interaction) is provided by the Android XR Extensions for Unity. It exposes an <AndroidXRMouse> device layout in the Unity Input System.

Palm Pose Interaction

The OpenXR Plugin provides support for the Palm Pose Interaction (XR_EXT_palm_pose), which exposes the <PalmPose> layout within the Unity Input System. Palm pose is not meant to be an alternative to extensions or packages that perform hand tracking for more complex use cases; instead it can be used to place app-specific visual content such as avatar visuals. The palm pose consists of both palm position and orientation.
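
For example, you might parent a small piece of avatar or UI content to the palm. The control names in the sketch below are assumptions about the <PalmPose> layout; confirm the exact paths in the Input Debugger.

using UnityEngine;
using UnityEngine.InputSystem;

// Keeps this GameObject at the user's left palm pose.
public class PalmFollower : MonoBehaviour
{
    InputAction m_PalmPosition;
    InputAction m_PalmRotation;

    void OnEnable()
    {
        // Assumed binding paths; confirm against the Palm Pose layout.
        m_PalmPosition = new InputAction(binding: "<PalmPose>{LeftHand}/palmPose/position");
        m_PalmRotation = new InputAction(binding: "<PalmPose>{LeftHand}/palmPose/rotation");
        m_PalmPosition.Enable();
        m_PalmRotation.Enable();
    }

    void OnDisable()
    {
        m_PalmPosition.Disable();
        m_PalmRotation.Disable();
    }

    void Update()
    {
        transform.SetPositionAndRotation(
            m_PalmPosition.ReadValue<Vector3>(),
            m_PalmRotation.ReadValue<Quaternion>());
    }
}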

XR Hands

The XR Hands package allows you to access hand tracking data (XR_EXT_hand_tracking and XR_FB_hand_tracking_aim) and provides a wrapper to convert hand joint data from hand tracking to input poses. To use the features provided by the XR Hands package, enable the Hand Tracking Subsystem and Meta Hand Tracking Aim OpenXR features.

Example showing how to enable hand tracking

The XR Hands package can be useful if you need more granular hand pose or hand joint data, or when you need to work with custom gestures.

For more details, see Unity's documentation for setting up XR Hands in your project.
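
The sketch below polls the XRHandSubsystem for joint data. It is a minimal example that reads a single joint and assumes the Hand Tracking Subsystem feature is enabled.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.Hands;

// Reads the left index fingertip pose from the XR Hands subsystem.
public class IndexTipReader : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            m_Subsystem = subsystems.Count > 0 ? subsystems[0] : null;
            if (m_Subsystem == null)
                return;
        }

        XRHand leftHand = m_Subsystem.leftHand;
        if (!leftHand.isTracked)
            return;

        XRHandJoint indexTip = leftHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
            Debug.Log($"Left index tip (XR Origin space): {pose.position}");
    }
}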

Choose a way to render hands

Android XR supports two ways of rendering hands: a hand mesh and a prefab visualizer.

Hand mesh

The Android XR Unity package contains a Hand Mesh feature that provides access to the XR_ANDROID_hand_mesh extension. The Hand Mesh feature provides meshes for the user's hands; each mesh contains the vertices of the triangles that make up the hand's geometry. This feature is intended to provide a personalized mesh that represents the real geometry of the user's hands for visualization.

XR Hands prefab

The XR Hands package includes a sample called Hands visualizer, which provides fully rigged left and right hands for rendering a context-appropriate representation of the user's hands.

System gestures

Android XR includes a system gesture that opens a menu where users can go back, open the launcher, or get an overview of currently running applications. The user can activate this system menu by using a dominant-hand pinch.

When the user is interacting with the system navigation menu, the application will only respond to head tracking events. The XR Hands package can detect when a user performs specific actions, such as interacting with this system navigation menu. Checking for the MetaAimFlags values SystemGesture and DominantHand will let you know when this system action is performed. For more information, refer to Unity's Enum MetaAimFlags documentation.
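
The sketch below watches for that combination using the MetaAimHand device from the XR Hands package; the device and control names are assumptions based on the package's Meta Hand Tracking Aim support, so verify them against the package version you use.

using UnityEngine;
using UnityEngine.XR.Hands;

// Detects when the dominant hand is performing the system gesture.
public class SystemGestureWatcher : MonoBehaviour
{
    void Update()
    {
        CheckHand(MetaAimHand.left);
        CheckHand(MetaAimHand.right);
    }

    static void CheckHand(MetaAimHand hand)
    {
        if (hand == null)
            return;

        var flags = (MetaAimFlags)hand.aimFlags.ReadValue();
        if ((flags & MetaAimFlags.SystemGesture) != 0 && (flags & MetaAimFlags.DominantHand) != 0)
            Debug.Log("User is interacting with the system navigation menu");
    }
}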

XR Interaction Toolkit

The XR Interaction Toolkit package is a high-level, component-based, interaction system for creating VR and AR experiences. It provides a framework that makes 3D and UI interactions available from Unity input events. It supports interaction tasks including haptic feedback, visual feedback, and locomotion.