The Android XR SDK is in Developer Preview, and we want your feedback! Visit our support page to get in touch.
Develop with OpenXR
Android XR supports apps built with OpenXR through its support for the OpenXR 1.1 specification and select vendor extensions. OpenXR is an open standard that lets you create immersive and interactive experiences using a common set of APIs across a wide range of XR devices.
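Because capabilities beyond the OpenXR 1.1 core ship as extensions, apps typically start by asking the runtime which extensions it exposes and enabling only those. The following is a minimal C sketch using only core OpenXR calls; error handling is omitted for brevity:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <openxr/openxr.h>

// Returns nonzero if the runtime advertises the named extension.
static int runtime_supports(const char *ext_name) {
    // First call gets the count, second call fills the array.
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(NULL, 0, &count, NULL);

    XrExtensionProperties *props = calloc(count, sizeof(XrExtensionProperties));
    for (uint32_t i = 0; i < count; i++) {
        props[i].type = XR_TYPE_EXTENSION_PROPERTIES;
    }
    xrEnumerateInstanceExtensionProperties(NULL, count, &count, props);

    int found = 0;
    for (uint32_t i = 0; i < count; i++) {
        if (strcmp(props[i].extensionName, ext_name) == 0) {
            found = 1;
            break;
        }
    }
    free(props);
    return found;
}

int main(void) {
    // XR_EXT_HAND_TRACKING_EXTENSION_NAME is defined by the OpenXR headers.
    if (runtime_supports(XR_EXT_HAND_TRACKING_EXTENSION_NAME)) {
        printf("Hand tracking extension is available.\n");
    }
    return 0;
}
```

Extensions discovered this way are then listed in `XrInstanceCreateInfo::enabledExtensionNames` when creating the instance.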
Features
Android XR supports features that let you build apps that take full advantage of the unique capabilities of XR devices, using OpenXR. These features include the following.
Trackables
: Supports *plane detection*, the ability to identify and track flat surfaces within the environment, enabling the placement of virtual objects in relation to the real world, and *anchors*, virtual points of reference that can be attached to real-world objects or locations, ensuring that virtual content remains accurately positioned and oriented even as the user moves around.

Raycasting
: A technique used to determine the intersection point between a virtual ray and objects in the scene, facilitating interactions such as selecting and manipulating virtual elements.

Anchor persistence
: The capability to save and restore anchors across multiple sessions, allowing for persistent and consistent placement of virtual content within the environment.

Object tracking
: The ability to track mice, keyboards, and other objects in the real world.

QR code tracking
: The ability to track QR codes in the physical environment and decode their data.

Depth textures
: The generation of depth maps that provide information about the distance between the camera and objects in the scene, enabling more realistic occlusion and interaction effects.

Passthrough
: The ability to blend real-world camera footage with virtual content, creating a mixed reality experience that seamlessly combines the physical and digital worlds.

Scene meshing
: The ability to acquire a 3D mesh of the environment, which can be used for physics, occlusion, and other world-aware interactions.

Composition layer passthrough
: Allows for a polygonal passthrough cutout in a composition layer, which can be used to bring real-world objects into a scene.

Face tracking
: The ability to track the features of the user's face, enabling the creation of more realistic and expressive avatars and virtual characters.

Eye tracking
: Provides the position and orientation of the user's eyes, designed to make avatar eye poses more realistic.

Hand tracking
: The ability to track the position and movement of the user's hands (see the sketch after this list).

Hand mesh
: An accurate representation of the user's hands as a low-poly mesh, optimized for platform-to-application delivery so you get the best performance possible. This is an alternative to other extensions that use a bind pose and blend weights.

Light estimation
: Used by lighting models to match the user's real-world lighting conditions.
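As one illustration of how a capability from this list is reached from native code, here is a hedged sketch of hand tracking via the standard `XR_EXT_hand_tracking` extension (which must be enabled at instance creation). The `instance`, `session`, `app_space`, and `predicted_time` names are placeholders assumed to already exist in your app:

```c
#include <openxr/openxr.h>

// Extension entry points are not exported directly; they must be
// loaded through xrGetInstanceProcAddr.
PFN_xrCreateHandTrackerEXT pfnCreateHandTracker = NULL;
PFN_xrLocateHandJointsEXT  pfnLocateHandJoints  = NULL;

void setup_and_locate(XrInstance instance, XrSession session,
                      XrSpace app_space, XrTime predicted_time) {
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction *)&pfnCreateHandTracker);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction *)&pfnLocateHandJoints);

    // One tracker handle per hand.
    XrHandTrackerCreateInfoEXT create_info = {
        .type = XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT,
        .hand = XR_HAND_RIGHT_EXT,
        .handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT,
    };
    XrHandTrackerEXT tracker = XR_NULL_HANDLE;
    pfnCreateHandTracker(session, &create_info, &tracker);

    // Per frame: locate all 26 joints relative to the app's reference space.
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = {
        .type = XR_TYPE_HAND_JOINT_LOCATIONS_EXT,
        .jointCount = XR_HAND_JOINT_COUNT_EXT,
        .jointLocations = joints,
    };
    XrHandJointsLocateInfoEXT locate_info = {
        .type = XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT,
        .baseSpace = app_space,
        .time = predicted_time,
    };
    pfnLocateHandJoints(tracker, &locate_info, &locations);

    if (locations.isActive) {
        // joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose is the index fingertip pose.
    }
}
```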
Supported input devices
Android XR also supports the following input devices.
Hand interaction
: The recognition of specific hand gestures, such as pinching, swiping, and pointing, enabling users to interact with virtual objects using gestures and hand movements.

Eye gaze interaction
: The ability to track the user's eye movements, letting them select and interact with virtual objects using their gaze.

6DoF motion controllers
: The ability to track a controller's position and movement, along with D-pad and button bindings for triggering actions or hover events within the application (a binding sketch follows this list).

Mouse interaction
: The ability for users to interact with objects through a mouse pointer in 3D space.
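To show how such input is wired up, the following sketch uses the core OpenXR action system with the `khr/simple_controller` interaction profile from the core specification; the runtime remaps suggested bindings to whatever device is actually present. The `gameplay` and `select` names are illustrative, and error handling is omitted:

```c
#include <string.h>
#include <openxr/openxr.h>

void create_select_action(XrInstance instance) {
    // Actions live in action sets, which are attached to the session later.
    XrActionSetCreateInfo set_info = {.type = XR_TYPE_ACTION_SET_CREATE_INFO};
    strcpy(set_info.actionSetName, "gameplay");
    strcpy(set_info.localizedActionSetName, "Gameplay");
    XrActionSet action_set = XR_NULL_HANDLE;
    xrCreateActionSet(instance, &set_info, &action_set);

    // A boolean action, e.g. "select the object being pointed at".
    XrActionCreateInfo action_info = {
        .type = XR_TYPE_ACTION_CREATE_INFO,
        .actionType = XR_ACTION_TYPE_BOOLEAN_INPUT,
    };
    strcpy(action_info.actionName, "select");
    strcpy(action_info.localizedActionName, "Select");
    XrAction select_action = XR_NULL_HANDLE;
    xrCreateAction(action_set, &action_info, &select_action);

    // Suggest a binding for the core simple-controller profile.
    XrPath profile_path, select_path;
    xrStringToPath(instance, "/interaction_profiles/khr/simple_controller",
                   &profile_path);
    xrStringToPath(instance, "/user/hand/right/input/select/click",
                   &select_path);

    XrActionSuggestedBinding binding = {select_action, select_path};
    XrInteractionProfileSuggestedBinding suggested = {
        .type = XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING,
        .interactionProfile = profile_path,
        .countSuggestedBindings = 1,
        .suggestedBindings = &binding,
    };
    xrSuggestInteractionProfileBindings(instance, &suggested);

    // After xrAttachSessionActionSets, poll each frame with xrSyncActions
    // and read the state via xrGetActionStateBoolean.
}
```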
Supported performance features
Android XR supports the following performance-related features.
Eye-tracked foveation
: Lets an app render higher-resolution content only at the eyes' focal point.

Space warp
: Uses velocity vectors and depth texture information to generate intermediate frames, effectively boosting the frame rate and keeping users immersed in your experience.

Performance metrics
: Provides Android XR performance metrics at runtime for the current XR device, compositor, and XR application, including CPU frame time, GPU frame time, GPU utilization, CPU frequency, frames per second, and more (a frame-timing sketch follows this list).
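The linked extension is the authoritative source for runtime-measured metrics. As a rough app-side complement, the sketch below times the CPU cost of one pass through the core OpenXR frame loop; the `session` handle is assumed to exist, and layer submission is elided:

```c
#include <stdio.h>
#include <time.h>
#include <openxr/openxr.h>

void frame_once(XrSession session) {
    // xrWaitFrame throttles the app and predicts the display time.
    XrFrameWaitInfo wait_info = {.type = XR_TYPE_FRAME_WAIT_INFO};
    XrFrameState frame_state = {.type = XR_TYPE_FRAME_STATE};
    xrWaitFrame(session, &wait_info, &frame_state);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    XrFrameBeginInfo begin_info = {.type = XR_TYPE_FRAME_BEGIN_INFO};
    xrBeginFrame(session, &begin_info);

    // ... record rendering work for frame_state.predictedDisplayTime ...

    XrFrameEndInfo end_info = {
        .type = XR_TYPE_FRAME_END_INFO,
        .displayTime = frame_state.predictedDisplayTime,
        .environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE,
        .layerCount = 0,   // composition layers omitted in this sketch
        .layers = NULL,
    };
    xrEndFrame(session, &end_info);

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double cpu_ms = (t1.tv_sec - t0.tv_sec) * 1e3 +
                    (t1.tv_nsec - t0.tv_nsec) / 1e6;
    printf("CPU frame time: %.2f ms\n", cpu_ms);
}
```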
For a full list of supported features and extensions, see the OpenXR Feature Overview.
Supported engines
Note: The Android XR emulator is not supported for Unity or OpenXR apps.
Unity
Android XR's Unity support, built on top of OpenXR, lets developers create experiences using Unity 6. Learn more about building XR apps with Unity in the Unity overview.
OpenXR™ and the OpenXR logo are trademarks owned by The Khronos Group Inc. and are registered as a trademark in China, the European Union, Japan, and the United Kingdom.