The Android XR SDK is now available in Developer Preview. We'd love your feedback! Reach out to us through the support page.
Develop with OpenXR
Android XR supports apps built with OpenXR through its support for the OpenXR 1.1 specification and select vendor extensions. OpenXR is an open standard that lets you create immersive and interactive experiences using a common set of APIs across a wide range of XR devices.
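For context, here is a minimal sketch of what targeting the OpenXR 1.1 API looks like at instance creation. The function and structure names are from the core OpenXR specification; the application name and the single enabled extension are illustrative placeholders, not requirements of Android XR.

```c
#include <openxr/openxr.h>
#include <string.h>

// Minimal sketch: create an OpenXR instance that targets the 1.1 API.
// The extension list would come from your app's needs; the one shown
// here (XR_EXT_hand_tracking) is a standard extension used as an
// example later on this page.
XrInstance create_instance(void) {
    const char* extensions[] = {XR_EXT_HAND_TRACKING_EXTENSION_NAME};

    XrInstanceCreateInfo createInfo = {XR_TYPE_INSTANCE_CREATE_INFO};
    strncpy(createInfo.applicationInfo.applicationName, "MyXrApp",
            XR_MAX_APPLICATION_NAME_SIZE - 1);
    createInfo.applicationInfo.applicationVersion = 1;
    createInfo.applicationInfo.apiVersion = XR_MAKE_VERSION(1, 1, 0);
    createInfo.enabledExtensionCount = 1;
    createInfo.enabledExtensionNames = extensions;

    XrInstance instance = XR_NULL_HANDLE;
    XrResult result = xrCreateInstance(&createInfo, &instance);
    return (result == XR_SUCCESS) ? instance : XR_NULL_HANDLE;
}
```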
Features
Android XR supports features that let you build apps that take full advantage of the unique capabilities of XR devices using OpenXR. These features include the following.
- Trackables
- Supports plane detection, the ability to identify and track flat surfaces within the environment so that virtual objects can be placed in relation to the real world, and anchors, virtual points of reference that can be attached to real-world objects or locations so that virtual content remains accurately positioned and oriented even as the user moves around.
- Raycasting
- A technique used to determine the intersection point between a virtual ray and objects in the scene, facilitating interactions such as selecting and manipulating virtual elements.
- Anchor persistence
- The ability to save and restore anchors across multiple sessions, allowing persistent and consistent placement of virtual content within the environment.
- Object tracking
- The ability to track mice, keyboards, and other objects in the real world.
- QR code tracking
- The ability to track QR codes in the physical environment and decode their data.
- Depth textures
- The generation of depth maps that provide information about the distance between the camera and objects in the scene, enabling more realistic occlusion and interaction effects.
- Passthrough
- The ability to blend real-world camera footage with virtual content, creating a mixed reality experience that seamlessly combines the physical and digital worlds.
- Scene meshing
- The ability to acquire a 3D mesh of the environment, which can be used for physics, occlusion, and other world-aware interactions.
- Composition layer passthrough
- Allows a polygon passthrough composition layer cutout, which can be used to bring real-world objects into a scene.
- Face tracking
- The ability to track the features of the user's face, enabling more realistic and expressive avatars and virtual characters.
- Eye tracking
- Provides the position and orientation of the user's eyes, designed to make avatar eye poses more realistic.
- Hand tracking
- The ability to track the position and movement of the user's hands (see the sketch after this list).
- Hand mesh
- Provides an accurate representation of the user's hands as a low-poly mesh, optimized for platform-to-application delivery so you get the best performance possible. This is an alternative to other extensions that use a bind pose and blend weights.
- Light estimation
- Used by lighting models to match the user's real-world lighting conditions.
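As one concrete example of the features above, the following sketch polls hand-joint poses with XR_EXT_hand_tracking, the standard OpenXR extension for hand tracking. This page doesn't name the exact extension behind each Android XR feature, so treat that choice as an assumption; the `session`, `baseSpace`, and `displayTime` values are assumed to come from your existing session and frame loop.

```c
#include <openxr/openxr.h>

// Sketch: create a hand tracker and locate its joints.
// In a real app, create the tracker once and locate joints every frame.
void track_left_hand(XrInstance instance, XrSession session,
                     XrSpace baseSpace, XrTime displayTime) {
    // Extension functions must be loaded through xrGetInstanceProcAddr.
    PFN_xrCreateHandTrackerEXT xrCreateHandTrackerEXT = NULL;
    PFN_xrLocateHandJointsEXT xrLocateHandJointsEXT = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction*)&xrCreateHandTrackerEXT);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction*)&xrLocateHandJointsEXT);

    XrHandTrackerCreateInfoEXT createInfo = {XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
    createInfo.hand = XR_HAND_LEFT_EXT;
    createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;
    XrHandTrackerEXT handTracker = XR_NULL_HANDLE;
    xrCreateHandTrackerEXT(session, &createInfo, &handTracker);

    // Query all 26 joints relative to baseSpace at the predicted display time.
    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = {XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locateInfo = {XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
    locateInfo.baseSpace = baseSpace;
    locateInfo.time = displayTime;
    xrLocateHandJointsEXT(handTracker, &locateInfo, &locations);

    if (locations.isActive) {
        XrPosef palm = joints[XR_HAND_JOINT_PALM_EXT].pose;
        (void)palm; // Use the pose to drive interaction or rendering.
    }
}
```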
Supported input devices
Android XR also supports the following input devices.
- Hand interaction
- The recognition of specific hand gestures, such as pinching, swiping, and pointing, letting users interact with virtual objects through gestures and hand movements.
- Eye gaze interaction
- The ability to track the user's eye movements, letting them select and interact with virtual objects using their gaze.
- 6DoF motion controllers
- The ability to track a controller's position and movement, along with D-pad and button bindings for triggering actions or hover events within the application (a binding sketch follows this list).
- Mouse interaction
- The ability for users to interact with objects through a mouse pointer in 3D space.
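To give a feel for how these input devices reach an app, the sketch below declares a boolean "select" action and suggests bindings against the Khronos simple controller profile from the core specification. This is generic OpenXR input plumbing, not an Android XR-specific profile; the runtime remaps suggested bindings to whichever devices it actually supports.

```c
#include <openxr/openxr.h>
#include <string.h>

// Sketch: declare a select action and suggest controller bindings for it.
void setup_select_action(XrInstance instance) {
    XrActionSetCreateInfo setInfo = {XR_TYPE_ACTION_SET_CREATE_INFO};
    strcpy(setInfo.actionSetName, "gameplay");
    strcpy(setInfo.localizedActionSetName, "Gameplay");
    XrActionSet actionSet = XR_NULL_HANDLE;
    xrCreateActionSet(instance, &setInfo, &actionSet);

    XrActionCreateInfo actionInfo = {XR_TYPE_ACTION_CREATE_INFO};
    actionInfo.actionType = XR_ACTION_TYPE_BOOLEAN_INPUT;
    strcpy(actionInfo.actionName, "select");
    strcpy(actionInfo.localizedActionName, "Select");
    XrAction selectAction = XR_NULL_HANDLE;
    xrCreateAction(actionSet, &actionInfo, &selectAction);

    // Bind the action to the select click on both hands using the
    // cross-vendor simple controller profile from the core spec.
    XrPath profilePath, leftSelect, rightSelect;
    xrStringToPath(instance, "/interaction_profiles/khr/simple_controller",
                   &profilePath);
    xrStringToPath(instance, "/user/hand/left/input/select/click", &leftSelect);
    xrStringToPath(instance, "/user/hand/right/input/select/click", &rightSelect);

    XrActionSuggestedBinding bindings[] = {
        {selectAction, leftSelect},
        {selectAction, rightSelect},
    };
    XrInteractionProfileSuggestedBinding suggested = {
        XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
    suggested.interactionProfile = profilePath;
    suggested.suggestedBindings = bindings;
    suggested.countSuggestedBindings = 2;
    xrSuggestInteractionProfileBindings(instance, &suggested);
}
```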
Supported performance features
Android XR supports the following performance-related features.
- Eye-tracked foveation
- Allows an app to render higher-resolution content only at the eyes' focal point.
- Space warp
- Uses velocity vectors and depth texture information to generate intermediate frames, effectively boosting the frame rate and keeping users immersed in your experience.
- Performance metrics
- Provides Android XR performance metrics at runtime for the current XR device, compositor, and XR application. These include CPU frame time, GPU frame time, GPU utilization, CPU frequency, frames per second, and more (a sketch for checking extension availability follows this list).
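Because these features are exposed as OpenXR extensions, an app should confirm that the runtime advertises them before relying on them. The sketch below uses the core xrEnumerateInstanceExtensionProperties call; the extension name string in the usage comment is taken from the performance metrics link above.

```c
#include <openxr/openxr.h>
#include <stdbool.h>
#include <stdlib.h>
#include <string.h>

// Sketch: check whether the runtime exposes a named extension, using the
// standard two-call enumeration pattern from core OpenXR.
bool runtime_supports(const char* extensionName) {
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(NULL, 0, &count, NULL);

    XrExtensionProperties* props = calloc(count, sizeof(XrExtensionProperties));
    for (uint32_t i = 0; i < count; i++) {
        props[i].type = XR_TYPE_EXTENSION_PROPERTIES;
    }
    xrEnumerateInstanceExtensionProperties(NULL, count, &count, props);

    bool found = false;
    for (uint32_t i = 0; i < count; i++) {
        if (strcmp(props[i].extensionName, extensionName) == 0) {
            found = true;
            break;
        }
    }
    free(props);
    return found;
}

// Usage: only enable the extension at xrCreateInstance time if present.
// if (runtime_supports("XR_ANDROID_performance_metrics")) { ... }
```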
For a full list of supported features and extensions, see the OpenXR Feature Overview.
Supported engines
Note: The Android XR emulator is not supported for Unity or OpenXR apps.
Unity
Android XR's Unity support, built on top of OpenXR, lets developers create experiences using Unity 6. Learn more about building XR apps with Unity in the Unity overview.
OpenXR™ and the OpenXR logo are trademarks owned by The Khronos Group Inc. and are registered as a trademark in China, the European Union, Japan, and the United Kingdom.