
iOS

XRTracker on iOS uses Injected mode with AR Foundation and ARKit: the AR Foundation camera feeder captures each frame from the device camera and passes it to the tracker automatically.

Requirements

Requirement           Details
Unity Packages        AR Foundation 6.x+, Apple ARKit XR Plugin
Build Tool            Xcode (latest stable recommended)
Architecture          ARM64
Scripting Backend     IL2CPP only
Minimum iOS Version   13.0+
Native Library        Static library (.a, ARM64)

Warning

Mono scripting backend is not supported on iOS. Use IL2CPP.

Setup

1. Install AR Foundation Packages

In the Unity Package Manager, install:

  • AR Foundation (6.x or later)
  • Apple ARKit XR Plugin

2. Configure XRTrackerManager

  1. Set Image Source to Injected
  2. Add an AR Session and XR Origin to your scene (standard AR Foundation setup)
  3. The AR Foundation camera feeder handles frame delivery automatically
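If you prefer to configure this from code rather than the inspector, a minimal sketch might look like the following. Note that `XRTrackerManager.imageSource` and the `ImageSource.Injected` enum value are assumed names mirroring the inspector fields described above, not a confirmed API:

```csharp
using UnityEngine;

// Hypothetical sketch: selects Injected mode at startup. The property
// and enum names below are assumptions based on the inspector settings
// described in this section -- check the actual component API.
public class TrackerSetup : MonoBehaviour
{
    [SerializeField] XRTrackerManager trackerManager;

    void Start()
    {
        // Injected mode: the AR Foundation camera feeder delivers frames,
        // so no manual frame submission is needed here.
        trackerManager.imageSource = ImageSource.Injected;
    }
}
```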

3. Player Settings

In Edit > Project Settings > Player > iOS:

Setting                      Value
Camera Usage Description     Required -- describe why your app uses the camera
Architecture                 ARM64
Scripting Backend            IL2CPP
Target minimum iOS Version   13.0

Build will fail without Camera Usage Description

iOS requires an NSCameraUsageDescription string. Set it in Player Settings under Camera Usage Description. Without it, the app will crash as soon as the camera is first accessed, or be rejected during App Store review.
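These settings can also be applied from an Editor script (for example as a pre-build step) so they are never forgotten when switching machines. `PlayerSettings.iOS.cameraUsageDescription`, `PlayerSettings.SetScriptingBackend`, and `PlayerSettings.iOS.targetOSVersionString` are standard Unity Editor APIs; the description text is a placeholder to replace with your own:

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Applies the iOS player settings from this section in code.
public static class IOSPlayerSettings
{
    [MenuItem("Build/Apply iOS Tracker Settings")]
    public static void Apply()
    {
        // NSCameraUsageDescription -- required, or the app is killed
        // when the camera is first accessed.
        PlayerSettings.iOS.cameraUsageDescription =
            "The camera is used for real-time object tracking.";

        // IL2CPP is the only supported scripting backend on iOS.
        PlayerSettings.SetScriptingBackend(
            BuildTargetGroup.iOS, ScriptingImplementation.IL2CPP);

        // Minimum deployment target from the requirements table.
        PlayerSettings.iOS.targetOSVersionString = "13.0";
    }
}
#endif
```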

Depth Tracking (LiDAR)

Depth tracking is available on devices with a LiDAR scanner:

  • iPhone 12 Pro / Pro Max and later Pro models
  • iPad Pro (2020 and later)

To enable depth:

  1. Add an AROcclusionManager component to the AR Camera
  2. Enable Depth Tracking on each TrackedBody that should use depth
  3. The AR Foundation feeder passes depth frames to the tracker automatically
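The steps above can be sketched as follows. `AROcclusionManager` and `EnvironmentDepthMode` are AR Foundation APIs (the exact descriptor property name varies between AR Foundation versions), while `TrackedBody.depthTrackingEnabled` is an assumed name for the per-body toggle in step 2:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class DepthSetup : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;
    [SerializeField] TrackedBody trackedBody; // assumed XRTracker component

    void Start()
    {
        // Query LiDAR depth support; the descriptor property name differs
        // slightly across AR Foundation versions.
        var descriptor = occlusionManager.descriptor;
        bool hasDepth = descriptor != null
            && descriptor.supportsEnvironmentDepthImage;

        if (hasDepth)
        {
            // Request environment depth from ARKit.
            occlusionManager.requestedEnvironmentDepthMode =
                EnvironmentDepthMode.Fastest;

            // Hypothetical per-body toggle mirroring the inspector
            // setting described in step 2 above.
            trackedBody.depthTrackingEnabled = true;
        }
        // Without LiDAR, silhouette and edge modalities still work.
    }
}
```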

Note

On devices without LiDAR, depth tracking is unavailable. Silhouette and edge modalities work on all ARKit-compatible devices.

Camera Intrinsics

AR Foundation provides accurate per-frame camera intrinsics from ARKit. No calibration file is needed -- intrinsics are passed to the tracker with each frame automatically.
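If you want to inspect the intrinsics yourself (for debugging or logging), they can be read per frame via `ARCameraManager.TryGetIntrinsics`, a standard AR Foundation API; the tracker itself does not need this, since the feeder forwards intrinsics automatically:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class IntrinsicsLogger : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void Update()
    {
        // XRCameraIntrinsics holds focal length, principal point, and
        // image resolution in pixels for the current frame.
        if (cameraManager.TryGetIntrinsics(out XRCameraIntrinsics intrinsics))
        {
            Debug.Log($"fx/fy: {intrinsics.focalLength}, " +
                      $"cx/cy: {intrinsics.principalPoint}, " +
                      $"resolution: {intrinsics.resolution}");
        }
    }
}
```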

Build & Deploy

  1. File > Build Settings -- switch platform to iOS
  2. Build to an Xcode project folder
  3. Open the .xcodeproj in Xcode
  4. Set your signing team and bundle identifier
  5. Build and run on a connected device
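Steps 1-2 can be scripted for CI with `BuildPipeline.BuildPlayer`, a standard Unity Editor API that produces the Xcode project folder; signing and deployment (steps 3-5) still happen in Xcode. The scene path and output folder below are placeholders:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEditor.Build.Reporting;

public static class IOSBuild
{
    [MenuItem("Build/Build iOS Xcode Project")]
    public static void Build()
    {
        var options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Scenes/Main.unity" }, // your scenes
            locationPathName = "Builds/iOS", // Xcode project output folder
            target = BuildTarget.iOS,
            options = BuildOptions.None,
        };

        BuildReport report = BuildPipeline.BuildPlayer(options);
        if (report.summary.result != BuildResult.Succeeded)
            throw new System.Exception("iOS build failed");
    }
}
#endif
```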

Tip

Always test on a physical device. The iOS Simulator does not support ARKit or camera access.

Tips

  • LiDAR availability: Check AROcclusionManager.descriptor.supportsEnvironmentDepthImage at runtime to adapt your UI or tracking configuration for devices without LiDAR.
  • Thermal throttling: Sustained tracking on iOS can cause thermal throttling. Monitor ProcessInfo.thermalState and consider reducing tracking frequency or disabling expensive modalities when the device is hot.
  • First launch: The first AR session start may take a few seconds while ARKit initializes. Show a loading indicator during this time.
  • Background behavior: AR sessions pause when the app enters background. Tracking resumes automatically when returning to foreground, but the tracker will need to re-detect objects.
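The thermal-throttling tip above needs a small native bridge, since `ProcessInfo.thermalState` is an iOS Foundation API that Unity does not expose directly. A minimal sketch, assuming you provide a matching Objective-C function in a `.mm` plugin (`XRT_GetThermalState` is a name invented here):

```csharp
using System.Runtime.InteropServices;
using UnityEngine;

public class ThermalMonitor : MonoBehaviour
{
    // Implemented in a native .mm plugin, for example:
    //   extern "C" int XRT_GetThermalState() {
    //       return (int)[[NSProcessInfo processInfo] thermalState];
    //   }
    // 0 = nominal, 1 = fair, 2 = serious, 3 = critical.
#if UNITY_IOS && !UNITY_EDITOR
    [DllImport("__Internal")] static extern int XRT_GetThermalState();
#else
    static int XRT_GetThermalState() => 0; // stub off-device
#endif

    void Update()
    {
        if (XRT_GetThermalState() >= 2)
        {
            // Device is hot: reduce tracking frequency or disable
            // expensive modalities, as suggested in the tips above.
        }
    }
}
```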