iOS¶
XRTracker on iOS uses Injected mode with AR Foundation and ARKit. The AR Foundation camera feeder captures each frame from the device camera and feeds it to the tracker automatically.
Requirements¶
| Requirement | Details |
|---|---|
| Unity Packages | AR Foundation 6.x+, Apple ARKit XR Plugin |
| Build Tool | Xcode (latest stable recommended) |
| Architecture | ARM64 |
| Scripting Backend | IL2CPP only |
| Minimum iOS | 13.0+ |
| Native Library | Static library (.a, ARM64) |
Warning
Mono scripting backend is not supported on iOS. Use IL2CPP.
Setup¶
1. Install AR Foundation Packages¶
In the Unity Package Manager, install:
- AR Foundation (6.x or later)
- Apple ARKit XR Plugin
2. Configure XRTrackerManager¶
- Set Image Source to `Injected`
- Add an AR Session and XR Origin to your scene (standard AR Foundation setup)
- The AR Foundation camera feeder handles frame delivery automatically
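Since the Injected feeder relies on the standard AR Foundation scene objects being present, a small runtime sanity check can catch a missing piece early. This is a sketch using only standard AR Foundation types (`ARSession`, `XROrigin`); it is not part of the XRTracker API.

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Logs an error at startup if the scene is missing the AR Foundation
// components that the Injected-mode camera feeder depends on.
public class ARSceneCheck : MonoBehaviour
{
    void Awake()
    {
        if (FindObjectOfType<ARSession>() == null)
            Debug.LogError("Missing ARSession. Add one via GameObject > XR > AR Session.");

        if (FindObjectOfType<XROrigin>() == null)
            Debug.LogError("Missing XROrigin. Add one via GameObject > XR > XR Origin.");
    }
}
```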
3. Player Settings¶
In Edit > Project Settings > Player > iOS:
| Setting | Value |
|---|---|
| Camera Usage Description | Required -- describe why your app uses the camera |
| Architecture | ARM64 |
| Scripting Backend | IL2CPP |
| Target minimum iOS Version | 13.0 |
Build will fail without Camera Usage Description
iOS requires an `NSCameraUsageDescription` string. Set this in Player Settings under Camera Usage Description. Without it, the app will crash on launch or be rejected by App Store review.
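The Player Settings above can also be applied from an editor script, which is useful for keeping them consistent across machines or CI. The snippet below uses Unity's `PlayerSettings` editor API; the usage-description text and menu path are placeholders for your own.

```csharp
#if UNITY_EDITOR
using UnityEditor;

public static class IOSTrackerBuildSettings
{
    [MenuItem("Tools/Apply iOS Tracker Settings")]
    public static void Apply()
    {
        // NSCameraUsageDescription shown to the user on first camera access.
        PlayerSettings.iOS.cameraUsageDescription =
            "The camera is used for real-time object tracking."; // placeholder text

        // IL2CPP is required; Mono is not supported on iOS.
        PlayerSettings.SetScriptingBackend(BuildTargetGroup.iOS,
            ScriptingImplementation.IL2CPP);

        // Minimum supported iOS version for the tracker.
        PlayerSettings.iOS.targetOSVersionString = "13.0";
    }
}
#endif
```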
Depth Tracking (LiDAR)¶
Depth tracking is available on devices with a LiDAR scanner:
- iPhone 12 Pro / Pro Max and later Pro models
- iPad Pro (2020 and later)
To enable depth:
- Add an `AROcclusionManager` component to the AR Camera
- Enable Depth Tracking on each `TrackedBody` that should use depth
- The AR Foundation feeder passes depth frames to the tracker automatically
Note
On devices without LiDAR, depth tracking is unavailable. Silhouette and edge modalities work on all ARKit-compatible devices.
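A runtime capability check lets one scene handle both LiDAR and non-LiDAR devices. This sketch uses the descriptor property named elsewhere in this page; note that the exact property name on `XROcclusionSubsystemDescriptor` has changed across AR Foundation versions, so verify it against the version you have installed.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Enables depth-based tracking only when the device supports
// environment depth (i.e. has a LiDAR scanner).
public class DepthCapabilityCheck : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Start()
    {
        // descriptor is null until the occlusion subsystem has been created.
        bool hasDepth = occlusionManager != null
            && occlusionManager.descriptor != null
            && occlusionManager.descriptor.supportsEnvironmentDepthImage;

        if (!hasDepth)
        {
            // Fall back to silhouette/edge modalities, which work on
            // all ARKit-compatible devices.
            Debug.Log("No LiDAR depth available; using non-depth modalities.");
        }
    }
}
```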
Camera Intrinsics¶
AR Foundation provides accurate per-frame camera intrinsics from ARKit. No calibration file is needed -- intrinsics are passed to the tracker with each frame automatically.
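The feeder handles intrinsics for you, but if you want to inspect the values ARKit reports (for debugging or logging), AR Foundation exposes them through `ARCameraManager.TryGetIntrinsics`:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Logs the per-frame camera intrinsics that ARKit provides.
public class IntrinsicsLogger : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void Update()
    {
        if (cameraManager.TryGetIntrinsics(out XRCameraIntrinsics intrinsics))
        {
            Debug.Log($"fx={intrinsics.focalLength.x} fy={intrinsics.focalLength.y} " +
                      $"cx={intrinsics.principalPoint.x} cy={intrinsics.principalPoint.y} " +
                      $"resolution={intrinsics.resolution}");
        }
    }
}
```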
Build & Deploy¶
- File > Build Settings -- switch platform to iOS
- Build to an Xcode project folder
- Open the `.xcodeproj` in Xcode
- Set your signing team and bundle identifier
- Build and run on a connected device
Tip
Always test on a physical device. The iOS Simulator does not support ARKit or camera access.
Tips¶
- LiDAR availability: Check `AROcclusionManager.descriptor.supportsEnvironmentDepthImage` at runtime to adapt your UI or tracking configuration for devices without LiDAR.
- Thermal throttling: Sustained tracking on iOS can cause thermal throttling. Monitor `ProcessInfo.thermalState` and consider reducing tracking frequency or disabling expensive modalities when the device is hot.
- First launch: The first AR session start may take a few seconds while ARKit initializes. Show a loading indicator during this time.
- Background behavior: AR sessions pause when the app enters background. Tracking resumes automatically when returning to foreground, but the tracker will need to re-detect objects.
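The first-launch and background-resume cases above can both be handled by watching `ARSession.stateChanged`, a standard AR Foundation event. The `ResetDetection` call is a hypothetical stand-in for whatever re-detection entry point your tracker setup uses.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Shows a loading indicator while ARKit initializes and reacts
// when the session regains tracking (e.g. after backgrounding).
public class SessionStateWatcher : MonoBehaviour
{
    [SerializeField] GameObject loadingIndicator;

    void OnEnable()  => ARSession.stateChanged += OnStateChanged;
    void OnDisable() => ARSession.stateChanged -= OnStateChanged;

    void OnStateChanged(ARSessionStateChangedEventArgs args)
    {
        bool initializing = args.state == ARSessionState.SessionInitializing;
        if (loadingIndicator != null)
            loadingIndicator.SetActive(initializing);

        if (args.state == ARSessionState.SessionTracking)
        {
            // Session is tracking again; the tracker must re-detect objects.
            // xrTracker.ResetDetection(); // hypothetical SDK call
        }
    }
}
```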