
AR Foundation Integration

XRTracker integrates with Unity's AR Foundation to provide 6DoF object tracking on iOS and Android. AR Foundation supplies the camera feed, depth data, and SLAM-based world tracking that XRTracker builds on.

Requirements

Requirement                   Version
Unity                         6.0+
AR Foundation                 6.x+
ARKit XR Plugin (iOS)         6.x+
ARCore XR Plugin (Android)    6.x+

Required Packages

Install these via the Unity Package Manager:

  • com.unity.xr.arfoundation
  • com.unity.xr.arkit (iOS) or com.unity.xr.arcore (Android)
  • com.formulaxr.tracker
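
If you prefer to manage dependencies by hand, the same packages can be declared directly in `Packages/manifest.json`. The version numbers below are illustrative, not pinned requirements — use versions that satisfy the table above:

```json
{
  "dependencies": {
    "com.unity.xr.arfoundation": "6.0.3",
    "com.unity.xr.arkit": "6.0.3",
    "com.unity.xr.arcore": "6.0.3",
    "com.formulaxr.tracker": "1.0.0"
  }
}
```

You only need the ARKit or ARCore entry for the platforms you target; keeping both is harmless since each plugin is only loaded on its own platform.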

Scene Setup

AR Scene Hierarchy

Your scene needs the standard AR Foundation hierarchy plus the XRTracker components:

  1. AR Session — Add an ARSession component to a GameObject
  2. XR Origin — Add an XR Origin with an AR Camera Manager on the camera
  3. XRTrackerManager — Set the Image Source to Injected (AR Foundation feeds frames to the tracker)
  4. Add TrackedBody components for each object you want to track

Scene Hierarchy:
├── AR Session
├── XR Origin
│   └── Camera Offset
│       └── Main Camera (AR Camera Manager, AR Camera Background)
├── XRTrackerManager
└── Tracked Objects
    ├── TrackedBody_A
    └── TrackedBody_B

Warning

The ARSession must be active before XRTracker starts. If the AR session fails to initialize, tracking will not receive camera frames.
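
One way to satisfy this ordering is to keep the tracker disabled until AR Foundation reports a running session. A minimal sketch — the `XRTrackerManager` reference and the convention of enabling it to start tracking are assumptions; substitute your actual start API:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class TrackerStartup : MonoBehaviour
{
    // Assumed XRTracker entry point; replace with the actual manager API.
    [SerializeField] XRTrackerManager trackerManager;

    IEnumerator Start()
    {
        // Wait until AR Foundation reports an initialized, tracking session.
        while (ARSession.state < ARSessionState.SessionTracking)
        {
            if (ARSession.state == ARSessionState.Unsupported)
            {
                Debug.LogError("AR is not supported on this device.");
                yield break;
            }
            yield return null;
        }

        // The AR session is live; the tracker can now receive camera frames.
        trackerManager.enabled = true;
    }
}
```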

Camera Feed

Add the ARFoundationCameraFeeder component to the same GameObject as the XRTrackerManager. This component captures frames from ARCameraManager and feeds them to the tracker automatically each frame.

  • On iOS, ARKit provides frames from the wide-angle camera
  • On Android, ARCore provides frames from the main camera
  • Camera intrinsics (focal length, principal point) are supplied automatically by the subsystem per frame
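
To illustrate what the feeder does under the hood, here is a sketch of pulling frames and per-frame intrinsics from `ARCameraManager` — the hand-off call at the end is hypothetical, since `ARFoundationCameraFeeder` performs this internally:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class CameraFeedSketch : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager;

    void OnEnable()  => cameraManager.frameReceived += OnFrame;
    void OnDisable() => cameraManager.frameReceived -= OnFrame;

    void OnFrame(ARCameraFrameEventArgs args)
    {
        // Per-frame intrinsics from the subsystem (focal length, principal point).
        if (!cameraManager.TryGetIntrinsics(out XRCameraIntrinsics intrinsics))
            return;

        // CPU-side pixel data for the tracker.
        if (cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
        {
            using (image) // XRCpuImage must be disposed to release the native buffer
            {
                // Hypothetical hand-off; the real feeder does this for you.
                // tracker.SubmitFrame(image, intrinsics);
            }
        }
    }
}
```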

Camera Background

AR Foundation handles the camera background rendering on its own — add the AR Camera Background component to the AR camera (this is standard AR Foundation setup). No additional XRTracker background components are needed on mobile.

URP Setup

When using URP, the AR Camera Background requires the AR Background Renderer Feature on your URP renderer. Without it, the camera feed won't render. Select your Universal Renderer Data asset in the Project window, then choose Add Renderer Feature > AR Background Renderer Feature.

Depth from LiDAR and ToF

When a depth sensor is available, XRTracker can use it to enable depth-based tracking:

Device                      Sensor    Depth Source
iPhone Pro / iPad Pro       LiDAR     AROcclusionManager
Android (select devices)    ToF       AROcclusionManager

To enable depth:

  1. Add an AROcclusionManager component to the AR Camera
  2. Enable Depth Tracking on the TrackedBody inspector
  3. XRTracker will automatically use the depth frames when available
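
You can verify at runtime whether the device is actually delivering depth by polling the occlusion manager. This sketch only checks availability — the tracker consumes the depth frames itself once Depth Tracking is enabled:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class DepthAvailabilityCheck : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Update()
    {
        // Environment depth is only delivered on LiDAR/ToF-equipped devices.
        if (occlusionManager.TryAcquireEnvironmentDepthCpuImage(out XRCpuImage depth))
        {
            using (depth) // dispose to release the native buffer
            {
                // A depth.width x depth.height depth frame is available this frame.
            }
        }
    }
}
```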

Note

Depth tracking is optional. Silhouette and edge modalities work with color-only cameras. Depth improves robustness when available but is not required.

AR Pose Fusion (Stationary Objects)

AR Pose Fusion combines AR Foundation's SLAM world tracking with XRTracker's object tracking to produce stable AR overlays. Without it, tracked objects may drift or jitter relative to the real world.

How It Works

  • SLAM (AR Foundation) provides stable world-space camera tracking
  • Object tracking (XRTracker) provides precise object-relative pose
  • Pose fusion blends both signals: SLAM anchors the object in world space while object tracking provides local accuracy
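
Conceptually, the blend is a smoothed correction of the SLAM-anchored world pose toward the tracker's latest measurement. The sketch below is a simplified illustration of that idea, not the actual XRTracker internals; the field names are made up:

```csharp
using UnityEngine;

public class PoseFusionSketch : MonoBehaviour
{
    [SerializeField] float smoothTime = 0.1f; // seconds, matches the TrackedBody default

    Vector3 anchoredPosition;                  // world pose held stable by SLAM
    Quaternion anchoredRotation = Quaternion.identity;

    // Called whenever the object tracker produces a fresh world-space pose.
    public void OnTrackerPose(Vector3 measuredPosition, Quaternion measuredRotation)
    {
        // Exponential smoothing: blend in a fraction of the correction each frame,
        // so high-frequency tracker jitter never reaches the rendered object.
        float k = 1f - Mathf.Exp(-Time.deltaTime / smoothTime);
        anchoredPosition = Vector3.Lerp(anchoredPosition, measuredPosition, k);
        anchoredRotation = Quaternion.Slerp(anchoredRotation, measuredRotation, k);
        transform.SetPositionAndRotation(anchoredPosition, anchoredRotation);
    }
}
```

Smaller `smoothTime` values track corrections faster at the cost of visible jitter; larger values look steadier but lag behind real motion.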

Enabling Stationary Mode

On the TrackedBody component:

  • Enable the Is Stationary checkbox — marks this object as fixed in the real world
  • Smooth Time — controls how quickly pose corrections are blended in (seconds, default 0.1)

Stationary Mode Settings

On the XRTrackerManager:

  • Use AR Pose Fusion must be enabled (default: on when AR Foundation is available)

Tip

AR Pose Fusion is most useful for stationary objects (e.g., a product on a table, machinery on a floor). For objects that move in the real world, leave Is Stationary off.

State Machine

When a stationary body is tracking, it follows this state machine:

None → Stabilizing → Anchored ↔ Recovery

State          Behavior
Stabilizing    Applies the native tracking pose for ~30 frames to settle
Anchored       Object stays in place via SLAM; tracker corrections are blended in smoothly when quality is good
Recovery       Tracking lost; holds position and feeds the current world pose back to the tracker until quality recovers
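
The transitions described above can be sketched as a small state machine. This mirrors the documented behavior only — the transition conditions and the 30-frame constant are taken from the table, everything else is illustrative:

```csharp
// Mirror of the documented stationary-body states; the transition logic is a
// simplified sketch of the behavior described above, not the actual source.
public enum StationaryState { None, Stabilizing, Anchored, Recovery }

public class StationaryStateSketch
{
    const int StabilizeFrames = 30; // "~30 frames to settle"

    public StationaryState State { get; private set; } = StationaryState.None;
    int frames;

    public void Tick(bool trackingFound, bool qualityGood)
    {
        switch (State)
        {
            case StationaryState.None:
                if (trackingFound) { State = StationaryState.Stabilizing; frames = 0; }
                break;
            case StationaryState.Stabilizing:
                if (++frames >= StabilizeFrames) State = StationaryState.Anchored;
                break;
            case StationaryState.Anchored:
                if (!trackingFound) State = StationaryState.Recovery; // hold position
                break;
            case StationaryState.Recovery:
                if (trackingFound && qualityGood) State = StationaryState.Anchored;
                break;
        }
    }
}
```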

Tips

  • Always include an ARSession in the scene — without it, AR Foundation will not initialize
  • Test on device — AR Foundation features (especially depth) behave differently than in the Editor
  • On iOS, ensure Camera Usage Description is set in Player Settings
  • On Android, enable ARCore Required in XR Plug-in Management if your app requires AR
  • If depth is noisy or unavailable on certain Android devices, rely on silhouette or edge modalities instead