
Quick Start

Track your first object in under 5 minutes. Choose your path:

  • With Physical Object — Track a real object using your phone (AR) or webcam. Follows the video tutorial above.
  • Without Physical Object — Evaluate tracking on desktop using a pre-recorded sequence. No camera or device needed.

Why a Quest controller?

We use the Quest 2 controller because most AR/VR developers have one handy. However, its rounded shape and symmetry make it a moderately challenging target — objects with distinct features, flat faces, or asymmetric shapes will track more reliably.

Want to test with your own 3D model but don't have the physical object? Use the Recording & Playback guide to record a synthetic sequence from the Unity Scene View.

1. Download the Controller Model

Download the official Meta Quest controller 3D models:

Meta Quest Controller Art

Extract the downloaded ZIP file.

2. Import the Controller Model

  1. In the Project window, right-click in Assets and create a folder called Quickstart, with a subfolder called Models
  2. From the extracted ZIP, find quest2_controllers_div0.fbx and drag it into Assets/Quickstart/Models
  3. Select the imported FBX and configure the Import Settings:
    • Model tab: enable Read/Write, click Apply Model Import Settings
    • Rig tab: set Animation Type to None, click Apply Rig Import Settings
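
If you want these import settings to be repeatable (for example, when teammates clone the project), the same configuration can be applied through Unity's ModelImporter API. A minimal editor sketch, assuming the folder layout from steps 1 and 2:

```csharp
using UnityEditor;
using UnityEngine;

// Editor utility: applies the Quick Start import settings to the controller FBX.
public static class ControllerImportSetup
{
    [MenuItem("Tools/Quickstart/Configure Controller Import")]
    public static void Configure()
    {
        // Path assumes the Assets/Quickstart/Models folder created above.
        const string path = "Assets/Quickstart/Models/quest2_controllers_div0.fbx";
        var importer = AssetImporter.GetAtPath(path) as ModelImporter;
        if (importer == null)
        {
            Debug.LogError($"No model importer found at {path}");
            return;
        }

        importer.isReadable = true;                                // Model tab: Read/Write
        importer.animationType = ModelImporterAnimationType.None;  // Rig tab: Animation Type
        importer.SaveAndReimport();
    }
}
```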

3. Set Up the Scene

  1. Delete the default Main Camera from the scene
  2. Create the tracker for your path:

With Physical Object

  1. Go to GameObject > XRTracker > AR Tracker (for mobile) or PC Tracker (for webcam)
  2. This creates the camera rig and an XRTrackerManager

Without Physical Object

  1. Go to GameObject > XRTracker > PC Tracker
  2. This creates a PC camera and an XRTrackerManager
  3. Download and import the sample sequence:

    Download Sample Sequence

    Import via Assets > Import Package > Custom Package. The package extracts a recording to Assets/StreamingAssets/Recordings/Quest2Controller/.

  4. Select the PC_Tracker GameObject and set:

    • Image Source to Sequence
    • Sequence Directory to Recordings/Quest2Controller

4. Register Your License

  1. Go to Tools > XR Tracker > License Registration
  2. Register for a free Developer license — no payment required
  3. The license file auto-downloads into your project and is assigned to the manager

5. Prepare the Mesh

  1. Drag quest2_controllers_div0 from the Project window into the Scene Hierarchy
  2. Right-click the instance and select Prefab > Unpack
  3. Expand the hierarchy and find right_quest2_mesh
  4. Drag right_quest2_mesh to the root of the Hierarchy
  5. Delete the remaining quest2_controllers_div0 parent — we only need the right controller mesh

Hierarchy after unpack
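
The unpack-and-reparent steps above can also be scripted with Unity's PrefabUtility. This sketch assumes the GameObject names used in this guide and searches the whole hierarchy for the right-hand mesh:

```csharp
using UnityEditor;
using UnityEngine;

// Editor utility: unpacks the controller prefab instance, keeps only the
// right-hand mesh at the scene root, and discards the rest.
public static class MeshPrep
{
    [MenuItem("Tools/Quickstart/Prepare Controller Mesh")]
    public static void Prepare()
    {
        var root = GameObject.Find("quest2_controllers_div0");
        if (root == null)
        {
            Debug.LogError("Drag quest2_controllers_div0 into the scene first.");
            return;
        }

        // Equivalent of right-click > Prefab > Unpack.
        PrefabUtility.UnpackPrefabInstance(
            root, PrefabUnpackMode.OutermostRoot, InteractionMode.UserAction);

        // Find right_quest2_mesh anywhere in the hierarchy (exact nesting may vary).
        Transform mesh = null;
        foreach (var t in root.GetComponentsInChildren<Transform>(true))
            if (t.name == "right_quest2_mesh") { mesh = t; break; }

        if (mesh != null)
        {
            mesh.SetParent(null, true);       // move to the scene root, keep world pose
            Object.DestroyImmediate(root);    // delete the remaining parent
        }
    }
}
```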

6. Add the TrackedBody Component

  1. Select right_quest2_mesh in the Hierarchy
  2. Click Add Component and add TrackedBody
  3. The Mesh Filters field auto-populates from the GameObject and its children — these are the meshes XRTracker will use to represent and track this object

TrackedBody added

No model generation needed

The default tracking mode is edge tracking, which works directly from the mesh geometry — no model generation step required. If you'd like to use silhouette tracking instead, see the Silhouette Tracking guide.

7. Set Up a Viewpoint

The viewpoint defines the pose from which the tracker first searches for the object. A good viewpoint shows the object from roughly the same angle the real camera will see it.

  1. In the TrackedBody inspector, find the Lifecycle section
  2. Set Initialization to Viewpoint
  3. Click the Create button next to the Viewpoint field
  4. Position the viewpoint for your path:

With Physical Object

  1. Navigate the Scene View so the object looks roughly like it will through your camera
  2. Select the Viewpoint GameObject
  3. Use GameObject > Align With View (Ctrl+Shift+F) to snap it to the current Scene View camera

Without Physical Object

Set the Viewpoint Transform to these values (specific to the sample sequence):

             X       Y       Z
  Position   0.031   0.041   -0.297
  Rotation   7.7     -8.8    -175.6

Viewpoint Transform
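
The sample-sequence pose can also be applied from code. A short sketch; the "Viewpoint" GameObject name is assumed from the Create button above, so adjust it if your scene differs:

```csharp
using UnityEngine;

// Sketch: place the Viewpoint at the pose used by the sample sequence.
public class ViewpointSetup : MonoBehaviour
{
    void Awake()
    {
        var viewpoint = GameObject.Find("Viewpoint"); // name assumed from the Create button
        if (viewpoint == null) return;

        viewpoint.transform.localPosition = new Vector3(0.031f, 0.041f, -0.297f);
        viewpoint.transform.localRotation = Quaternion.Euler(7.7f, -8.8f, -175.6f);
    }
}
```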

Tip

When tracking your own objects later, use Align With View (Ctrl+Shift+F) instead of entering values manually. The viewpoint doesn't need to be perfect — the tracker refines the pose during detection.
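
If you align viewpoints often, the Align With View command can be reproduced as a small editor script that snaps the current selection to the Scene View camera:

```csharp
using UnityEditor;
using UnityEngine;

// Editor sketch: scripted equivalent of GameObject > Align With View
// for the currently selected object (e.g. the Viewpoint).
public static class AlignViewpoint
{
    [MenuItem("Tools/Quickstart/Align Selection With Scene View")]
    public static void Align()
    {
        var view = SceneView.lastActiveSceneView;
        var target = Selection.activeTransform;
        if (view == null || target == null) return;

        Undo.RecordObject(target, "Align With View");
        target.SetPositionAndRotation(view.camera.transform.position,
                                      view.camera.transform.rotation);
    }
}
```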

8. Add Outline Visualization (Optional)

Add a visual overlay to see the tracking in action:

  1. Select right_quest2_mesh
  2. Click Add Component and add TrackedBody Outline
  3. Toggle Show Internal Edges and Hide Source Mesh as desired

URP Only

If using URP, select your URP Renderer Asset and add:

  • AR Background Renderer Feature (AR path only)
  • Edge Outline Feature

If using the Built-in Render Pipeline, outlines work automatically — no extra setup needed.

9. Build & Run

Save the scene, then:

Mobile (AR)

  1. Open Build Profiles and add the current scene
  2. Select your target platform (Android or iOS) and your device
  3. Click Build and Run
  4. Allow camera access on the device when prompted
  5. Point the camera at the Quest 2 controller — the tracker will detect and lock on

Desktop (Webcam)

  1. Press Play in the Editor
  2. Point your webcam at the Quest 2 controller
  3. For best results, provide a camera calibration file (see Camera Calibration)

Desktop (Sample Sequence)

  1. Save the scene
  2. Press Play in the Editor
  3. The sample sequence plays back and tracking starts automatically

What's Happening

  1. Detection — XRTracker renders the model from the viewpoint and searches for a match in the camera image
  2. Starting — Once detected, the tracker refines the pose over several frames to confirm a good lock
  3. Tracking — Continuous frame-by-frame pose optimization keeps the model aligned with the real object

The Tracking Quality indicator in the inspector shows how well the tracker is locked on (0.0 = lost, 1.0 = perfect).
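
You can react to that quality value from your own scripts, for example to hide an overlay when the lock is lost. This is only a sketch: the `TrackedBody` type is the plugin's, and the `TrackingQuality` property name is an assumption based on the inspector label, so check the API reference for the real member:

```csharp
using UnityEngine;

// Sketch: hide an overlay when tracking quality drops below a threshold.
// `TrackingQuality` is an assumed property name based on the inspector label.
public class QualityGate : MonoBehaviour
{
    public TrackedBody body;            // assign in the inspector
    public GameObject overlay;          // e.g. the outline visualization
    [Range(0f, 1f)] public float minQuality = 0.5f;

    void Update()
    {
        if (body == null || overlay == null) return;
        overlay.SetActive(body.TrackingQuality >= minQuality); // 0.0 = lost, 1.0 = perfect
    }
}
```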

Next Steps