AR Foundation Integration¶
XRTracker integrates with Unity's AR Foundation for object tracking on iOS and Android. AR Foundation supplies the camera feed, depth data, and SLAM-based world tracking that XRTracker builds on.
Requirements¶
| Requirement | Version |
|---|---|
| Unity | 6.0+ |
| AR Foundation | 6.x+ |
| ARKit XR Plugin | 6.x+ (iOS) |
| ARCore XR Plugin | 6.x+ (Android) |
Required Packages¶
Install these via the Unity Package Manager:
- `com.unity.xr.arfoundation`
- `com.unity.xr.arkit` (iOS) or `com.unity.xr.arcore` (Android)
- `com.formulaxr.tracker`
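Equivalently, the dependencies can be added directly to `Packages/manifest.json`. The version numbers below are illustrative only (match them to the table above), and the registry entry for `com.formulaxr.tracker` depends on how XRTracker is distributed:

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "6.0.0",
    "com.unity.xr.arkit": "6.0.0",
    "com.unity.xr.arcore": "6.0.0",
    "com.formulaxr.tracker": "1.0.0"
  }
}
```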
Scene Setup¶
The fastest way to get started is GameObject > XRTracker > AR Tracker. This creates a pre-configured hierarchy with all required components:
AR_Tracker
├── AR Session (ARSession, ARInputManager)
└── XR Origin (XROrigin, ARAnchorManager, ARPlaneManager, ARRaycastManager)
└── Camera Offset
└── Main Camera (Camera, ARCameraManager, ARCameraBackground,
TrackedPoseDriver, ARFoundationCameraFeeder)
The root AR_Tracker has XRTrackerManager (Image Source = Injected) and TrackedBodyManager. Add TrackedBody components for each object you want to track.
Manual setup
If you prefer to set up from scratch:
- **AR Session**: Create a GameObject with `ARSession` and `ARInputManager`
- **XR Origin**: Create an `XR Origin` with a `Camera Offset` child containing the `Main Camera`. Add `ARAnchorManager`, `ARPlaneManager`, and `ARRaycastManager` to the XR Origin GameObject (no prefabs needed; these run invisibly for anchor placement)
- On the Main Camera, add `ARCameraManager`, `ARCameraBackground`, `TrackedPoseDriver`, and `ARFoundationCameraFeeder`
- Create a separate GameObject with `XRTrackerManager`: set Image Source to Injected and assign the Main Camera
- Add a `TrackedBodyManager` component to the same GameObject
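The manual setup above can also be done from script. This is a sketch, not a supported API: the AR Foundation calls are standard, but the XRTracker component names come from this page, and the `imageSource`/`trackingCamera` fields on `XRTrackerManager` are assumed stand-ins for the inspector settings:

```csharp
using UnityEngine;
using UnityEngine.InputSystem.XR; // TrackedPoseDriver (Input System variant)
using UnityEngine.XR.ARFoundation;
using Unity.XR.CoreUtils;

public static class ARTrackerBootstrap
{
    public static void Build()
    {
        // AR Session
        var session = new GameObject("AR Session");
        session.AddComponent<ARSession>();
        session.AddComponent<ARInputManager>();

        // XR Origin with the invisible anchor-placement managers
        var originGO = new GameObject("XR Origin");
        var origin = originGO.AddComponent<XROrigin>();
        originGO.AddComponent<ARAnchorManager>();
        originGO.AddComponent<ARPlaneManager>();
        originGO.AddComponent<ARRaycastManager>();

        var offset = new GameObject("Camera Offset");
        offset.transform.SetParent(originGO.transform, false);

        var camGO = new GameObject("Main Camera");
        camGO.transform.SetParent(offset.transform, false);
        var cam = camGO.AddComponent<Camera>();
        camGO.AddComponent<ARCameraManager>();
        camGO.AddComponent<ARCameraBackground>();
        camGO.AddComponent<TrackedPoseDriver>();
        camGO.AddComponent<ARFoundationCameraFeeder>(); // XRTracker's feeder

        origin.Camera = cam;
        origin.CameraFloorOffsetObject = offset;

        // XRTracker manager with the injected image source
        var trackerGO = new GameObject("XRTracker");
        var tracker = trackerGO.AddComponent<XRTrackerManager>();
        tracker.imageSource = XRTrackerManager.ImageSource.Injected; // hypothetical field
        tracker.trackingCamera = cam;                                // hypothetical field
        trackerGO.AddComponent<TrackedBodyManager>();
    }
}
```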
Warning
The ARSession must be active before XRTracker starts. If the AR session fails to initialize, tracking will not receive camera frames.
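One way to satisfy this ordering is to gate tracker startup on the session state. The `ARSession.state` checks below are standard AR Foundation API; `StartTracking()` is a hypothetical entry point standing in for however your project begins feeding the tracker:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class TrackerStartGate : MonoBehaviour
{
    [SerializeField] XRTrackerManager tracker; // component described on this page

    IEnumerator Start()
    {
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            yield return ARSession.CheckAvailability();
        }

        // Wait until the AR session has initialized and is actually tracking.
        while (ARSession.state < ARSessionState.SessionTracking)
            yield return null;

        tracker.StartTracking(); // hypothetical: begin consuming camera frames
    }
}
```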
Camera Feed¶
The ARFoundationCameraFeeder component captures frames from ARCameraManager and feeds them to the tracker automatically each frame. In the AR Tracker prefab, this is already on the Main Camera alongside the ARCameraManager.
- On iOS, ARKit provides frames from the wide-angle camera
- On Android, ARCore provides frames from the main camera
- Camera intrinsics (focal length, principal point) are supplied automatically by the subsystem per frame
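For reference, this is roughly what a camera feeder does each frame: acquire the latest CPU image and the per-frame intrinsics from `ARCameraManager`. The AR Foundation calls are real; the commented `FeedFrame` hand-off is a hypothetical stand-in for XRTracker's injection API:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ManualCameraFeeder : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void OnEnable()  => cameraManager.frameReceived += OnFrame;
    void OnDisable() => cameraManager.frameReceived -= OnFrame;

    void OnFrame(ARCameraFrameEventArgs args)
    {
        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image) // XRCpuImage must be disposed to release the native buffer
        {
            if (cameraManager.TryGetIntrinsics(out XRCameraIntrinsics intrinsics))
            {
                // Hypothetical: pass pixels plus focal length / principal point
                // to the tracker, as ARFoundationCameraFeeder does automatically.
                // tracker.FeedFrame(image, intrinsics.focalLength, intrinsics.principalPoint);
            }
        }
    }
}
```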
Camera Background¶
AR Foundation handles the camera background rendering on its own — add the AR Camera Background component to the AR camera (this is standard AR Foundation setup). No additional XRTracker background components are needed on mobile.
URP setup
When using URP, the AR Camera Background requires the AR Background Renderer Feature on your URP Renderer. Without it, the camera feed won't render. Select the Universal Renderer asset referenced by your URP Asset (the pipeline asset assigned in Edit > Project Settings > Graphics) and use Add Renderer Feature > AR Background Renderer Feature.
Depth from LiDAR and ToF¶
When a depth sensor is available, XRTracker can use it to enable depth-based tracking:
| Device | Sensor | Depth Source |
|---|---|---|
| iPhone Pro / iPad Pro | LiDAR | AROcclusionManager |
| Android devices with ToF | ToF | AROcclusionManager |
To enable depth:
- Add an `AROcclusionManager` component to the AR Camera
- Enable Depth Tracking in the TrackedBody inspector
- XRTracker will automatically use the depth frames when available
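To verify that depth frames are actually arriving on a given device, you can probe the occlusion manager directly. `TryAcquireEnvironmentDepthCpuImage` is standard AR Foundation API; this snippet only logs and is not part of XRTracker:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class DepthProbe : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusion;

    void Update()
    {
        if (occlusion != null &&
            occlusion.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage depth))
        {
            using (depth) // release the native buffer when done
            {
                // On LiDAR devices the format is typically a 32-bit float depth in meters.
                Debug.Log($"Depth frame {depth.width}x{depth.height}, format {depth.format}");
            }
        }
    }
}
```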
Note
Depth tracking is optional. Silhouette and edge modalities work with color-only cameras. Depth improves robustness when available but is not required.
AR Pose Fusion (Stationary Objects)¶
AR Pose Fusion combines AR Foundation's SLAM world tracking with XRTracker's object tracking to produce stable AR overlays. Without it, tracked objects may drift or jitter relative to the real world.
How It Works¶
- SLAM (AR Foundation) provides stable world-space camera tracking
- Object tracking (XRTracker) provides precise object-relative pose
- Pose fusion blends both signals: SLAM anchors the object in world space while object tracking provides local accuracy
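Conceptually, the blend can be sketched as an exponential ease from the anchored (SLAM-held) pose toward the tracker's latest estimate, so map refinements never cause visible jumps. This is an illustration of the idea, not XRTracker's actual implementation:

```csharp
using UnityEngine;

public static class PoseFusion
{
    // Ease the anchored pose toward the tracker's pose estimate.
    // smoothTime plays the same role as the Smooth Time setting described below.
    public static Pose Blend(Pose anchored, Pose tracked, float smoothTime, float dt)
    {
        // Frame-rate independent blend factor in [0, 1].
        float t = 1f - Mathf.Exp(-dt / Mathf.Max(smoothTime, 1e-4f));
        return new Pose(
            Vector3.Lerp(anchored.position, tracked.position, t),
            Quaternion.Slerp(anchored.rotation, tracked.rotation, t));
    }
}
```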
Requirements¶
AR Pose Fusion requires several AR managers on the XR Origin GameObject. The AR Tracker menu item (GameObject > XRTracker > AR Tracker) adds these automatically. If you set up your scene manually, add them yourself:
| Component | Purpose | Required? |
|---|---|---|
| `ARAnchorManager` | Creates and tracks the session anchor | Yes |
| `ARPlaneManager` | Detects surfaces for stable anchor placement | Recommended |
| `ARRaycastManager` | Raycasts against detected geometry for anchor placement | Recommended |
No prefabs are needed for ARPlaneManager or ARRaycastManager — they run invisibly in the background. XRTracker uses them at anchor creation time to find a feature-rich surface near the tracked object, which significantly improves anchor stability.
Without ARAnchorManager, pose fusion falls back to world-space tracking, which is susceptible to drift when ARKit/ARCore refines its map. Without ARPlaneManager/ARRaycastManager, the anchor is placed at the object's center, which may be on a featureless surface.
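The placement strategy described above looks roughly like this: raycast from the object's screen position onto detected planes and attach the anchor to the hit plane, falling back to a free-floating anchor at the object's position. All calls here are standard AR Foundation API; the function itself is an illustrative sketch, not XRTracker's code:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public static class AnchorPlacement
{
    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    public static ARAnchor PlaceNear(
        ARRaycastManager raycaster, ARAnchorManager anchors, ARPlaneManager planes,
        Camera cam, Vector3 objectWorldPos)
    {
        Vector2 screenPoint = cam.WorldToScreenPoint(objectWorldPos);
        if (raycaster.Raycast(screenPoint, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            var hit = s_Hits[0];
            var plane = planes.GetPlane(hit.trackableId);
            if (plane != null)
                return anchors.AttachAnchor(plane, hit.pose); // plane-backed anchor
        }

        // Fallback: free-floating anchor at the object's center.
        var go = new GameObject("Fallback Anchor");
        go.transform.position = objectWorldPos;
        return go.AddComponent<ARAnchor>();
    }
}
```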
Improving Anchor Stability with ARMeshManager (iOS LiDAR)¶
On devices with a LiDAR sensor (iPhone Pro, iPad Pro), adding an ARMeshManager to the XR Origin enables scene reconstruction. This gives ARKit richer 3D geometry to constrain the anchor, improving stability even on visually textureless surfaces.
- Add `ARMeshManager` to the XR Origin GameObject
- Assign a mesh prefab with `MeshFilter` and `MeshCollider` (no renderer needed; the mesh stays invisible)
Note
ARMeshManager is iOS-only (LiDAR). Android ToF sensors provide depth via AROcclusionManager but do not support scene mesh reconstruction through AR Foundation.
Enabling Stationary Mode¶
On the TrackedBody component:
- Enable the Is Stationary checkbox — marks this object as fixed in the real world
- Smooth Time — controls how quickly pose corrections are blended in (seconds, default 0.1)
On the XRTrackerManager:
- Use AR Pose Fusion must be enabled (default: on when AR Foundation is available)
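The same settings can be applied from script. The property names below (`isStationary`, `smoothTime`, `useARPoseFusion`) mirror the inspector fields described above but are assumptions, not confirmed XRTracker API:

```csharp
using UnityEngine;

public class StationarySetup : MonoBehaviour
{
    [SerializeField] TrackedBody body;
    [SerializeField] XRTrackerManager tracker;

    void Awake()
    {
        body.isStationary = true;    // object is fixed in the real world
        body.smoothTime = 0.1f;      // seconds over which pose corrections blend in
        tracker.useARPoseFusion = true;
    }
}
```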
Tip
AR Pose Fusion is most useful for stationary objects (e.g., a product on a table, machinery on a floor). For objects that move in the real world, leave Is Stationary off.
State Machine¶
When a stationary body is tracking, it follows this state machine:
| State | Behavior |
|---|---|
| Stabilizing | Applies native tracking pose for ~30 frames to settle |
| Anchored | Object stays in place via SLAM. Tracker corrections are blended in smoothly when quality is good |
| Recovery | Tracking lost — holds position, feeds current world pose back to tracker until quality recovers |
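The table above can be sketched as a small state machine. This is a conceptual illustration only: the 30-frame settle count comes from the table, but the `trackingGood` signal stands in for XRTracker's internal quality metric:

```csharp
public enum StationaryState { Stabilizing, Anchored, Recovery }

public class StationaryStateMachine
{
    const int StabilizeFrames = 30; // ~30 frames to settle, per the table above
    int frames;

    public StationaryState State { get; private set; } = StationaryState.Stabilizing;

    public void Tick(bool trackingGood)
    {
        switch (State)
        {
            case StationaryState.Stabilizing:
                // Apply the native tracking pose until it settles.
                if (++frames >= StabilizeFrames) State = StationaryState.Anchored;
                break;
            case StationaryState.Anchored:
                // SLAM holds the object; blend tracker corrections while quality is good.
                if (!trackingGood) State = StationaryState.Recovery;
                break;
            case StationaryState.Recovery:
                // Hold position and feed the world pose back until quality recovers.
                if (trackingGood) { frames = 0; State = StationaryState.Stabilizing; }
                break;
        }
    }
}
```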
Tips¶
- Always include an ARSession in the scene — without it, AR Foundation will not initialize
- Test on device — AR Foundation features (especially depth) behave differently than in the Editor
- On iOS, ensure Camera Usage Description is set in Player Settings
- On Android, enable ARCore Required in XR Plug-in Management if your app requires AR
- If depth is noisy or unavailable on certain Android devices, rely on silhouette or edge modalities instead