Camera Sources¶
XRTracker supports multiple camera sources for different platforms and use cases. The camera source is configured via the Image Source setting on the XRTrackerManager component.
Image Source Modes¶
| Mode | Description | Platform |
|---|---|---|
| Native | C++ native capture from a connected camera | Windows (desktop) |
| Injected | External code feeds frames to the tracker (AR Foundation, custom feeders) | iOS, Android, any |
| RealSense | Intel RealSense depth camera with color + depth | Windows (desktop) |
| Sequence | Plays back a pre-recorded frame sequence | All platforms |
Note
Only one image source can be active at a time.
Native (Desktop)¶
The default for desktop development. XRTracker uses its native C++ backend to capture directly from a connected camera.
| Setting | Description |
|---|---|
| Calibrations File | JSON file in StreamingAssets with camera calibration data |
| Auto Select Camera Name | Automatically select a camera by name at startup |
| Auto Select Fallback To First | If named camera not found, use the first available |
| Target FPS | Requested frame rate (10–60, default 30) |
Camera intrinsics are loaded from the calibration file. If no calibration is provided, intrinsics are estimated from the camera's field of view.
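The FOV fallback follows the standard pinhole relation. The sketch below illustrates that math only; it is not XRTracker's actual implementation, and the class and method names are hypothetical:

```csharp
using UnityEngine;

// Standard pinhole estimate of camera intrinsics from a vertical field of
// view. Illustrative only -- NOT XRTracker's internal code. Assumes square
// pixels (fx == fy) and a principal point at the image center.
public static class IntrinsicsEstimate
{
    // Returns (fx, fy, cx, cy) in pixels for an image of the given size.
    public static Vector4 FromVerticalFov(float verticalFovDeg, int width, int height)
    {
        float fy = height / (2f * Mathf.Tan(0.5f * verticalFovDeg * Mathf.Deg2Rad));
        return new Vector4(fy, fy, width * 0.5f, height * 0.5f);
    }
}
```

Because an estimate like this ignores lens distortion and the true principal point, a measured calibration file will always track better than the fallback.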
Injected (AR Foundation / Custom Feeders)¶
In Injected mode, XRTracker does not capture frames itself. Instead, an external component feeds frames via `manager.FeedFrame()`. This is how AR Foundation integration works on mobile:
- An AR Foundation camera feeder captures frames from `ARCameraManager` and calls `FeedFrame()` each frame
- Camera intrinsics are passed along with each frame (provided by the AR subsystem)
- Depth frames from `AROcclusionManager` are fed separately when available
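The feeder pattern above can be sketched as follows. The AR Foundation calls (`frameReceived`, `TryGetIntrinsics`, `TryAcquireLatestCpuImage`) are real APIs, but the `FeedFrame` overload shown is an assumption; check the XRTracker API reference for the actual signature:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal sketch of a custom camera feeder for Injected mode.
// The FeedFrame(image, intrinsics) overload is assumed, not verified.
public class CustomCameraFeeder : MonoBehaviour
{
    public XRTrackerManager manager;       // assign in the Inspector
    public ARCameraManager cameraManager;  // assign in the Inspector

    void OnEnable()  => cameraManager.frameReceived += OnFrameReceived;
    void OnDisable() => cameraManager.frameReceived -= OnFrameReceived;

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        // Per-frame intrinsics come from the AR subsystem.
        if (!cameraManager.TryGetIntrinsics(out XRCameraIntrinsics intrinsics))
            return;

        if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image))
            return;

        using (image)  // XRCpuImage must be disposed after use
        {
            manager.FeedFrame(image, intrinsics);
        }
    }
}
```

The same shape works for any custom source (network camera, video file): acquire a frame, pair it with intrinsics, and hand both to the manager.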
Use Injected mode for:
- AR Foundation on iOS and Android
- Custom camera sources (e.g., network cameras, video files)
- Any scenario where you control the frame pipeline
See the AR Foundation Integration guide for mobile AR setup.
RealSense (Desktop with Depth)¶
Connects directly to Intel RealSense depth cameras (D415, D435, D455, L515) via the RealSense SDK, providing synchronized color and depth streams.
| Setting | Description |
|---|---|
| Color Resolution | 640x480 @60fps, 960x540 @60fps, 1280x720 @30fps, 1920x1080 @30fps |
| Depth Resolution | 480x270 @90fps, 640x480 @90fps, 848x480 @90fps, 1280x720 @30fps |
Warning
The RealSense SDK must be installed on the system and the camera must be connected before starting the application.
Sequence (Recorded Playback)¶
Plays back a previously recorded sequence from disk, enabling offline testing without physical hardware.
| Setting | Description |
|---|---|
| Sequence Directory | Path to the folder containing `sequence.json` and recorded frames |
See the Recording & Playback guide for how to create and use recordings.
Camera Intrinsics¶
All tracking modalities depend on accurate camera intrinsics (focal length, principal point). How intrinsics are obtained depends on the source:
| Source | Intrinsics |
|---|---|
| Native | Loaded from calibration JSON file, or estimated from FOV |
| Injected (AR Foundation) | Provided by the AR subsystem per frame (accurate) |
| RealSense | Read from the camera firmware (accurate) |
| Sequence | Stored in `sequence.json` alongside the recording |
Camera Background Display¶
To show the live camera feed behind your tracked objects, you need a background component. The approach differs between mobile and desktop:
Mobile (AR Foundation)¶
AR Foundation handles camera background rendering automatically — add the AR Camera Background component to the AR camera. No XRTracker background component is needed. See the AR Foundation guide for details, including the URP renderer feature requirement.
Desktop (Native / RealSense)¶
On desktop, XRTracker provides two components for displaying the camera feed as a background:
TrackerBackgroundRenderer¶
Renders the camera feed on a 3D plane positioned in front of the camera. The plane is sized automatically to fill the camera's view.
- Add this component to the Main Camera GameObject
- The plane is created and managed automatically
- Works with perspective cameras
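The component is normally added in the editor, but attaching it from script is a one-liner. A minimal sketch (the component name comes from this page; no configuration parameters are assumed):

```csharp
using UnityEngine;

// Attach TrackerBackgroundRenderer to the main camera at startup.
// The component creates and sizes its background plane itself, so no
// further configuration is shown (or assumed) here.
public class DesktopBackgroundSetup : MonoBehaviour
{
    void Start()
    {
        Camera.main.gameObject.AddComponent<TrackerBackgroundRenderer>();
    }
}
```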
TrackerBackgroundUI¶
Renders the camera feed on a Unity UI Canvas using a `RawImage`. This approach uses the standard UI system.
- Add this component to a Canvas in your scene
- Uses a `RawImage` to display the camera texture
- Works with any canvas render mode (Screen Space or World Space)
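For the UI variant, a sketch of a scripted setup, assuming the component expects a full-screen `RawImage` under the same Canvas (how `TrackerBackgroundUI` locates its `RawImage` is an assumption here, not a verified detail):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative setup: a full-screen RawImage child under the Canvas, with
// TrackerBackgroundUI attached to the Canvas GameObject as described above.
public class BackgroundUISetup : MonoBehaviour
{
    void Start()
    {
        var canvas = GetComponent<Canvas>();

        // Full-screen RawImage to host the camera texture (assumed to be
        // picked up by the component; it may also create one itself).
        var imageGo = new GameObject("CameraBackground", typeof(RawImage));
        imageGo.transform.SetParent(canvas.transform, false);
        var rect = imageGo.GetComponent<RectTransform>();
        rect.anchorMin = Vector2.zero;
        rect.anchorMax = Vector2.one;
        rect.offsetMin = Vector2.zero;
        rect.offsetMax = Vector2.zero;

        canvas.gameObject.AddComponent<TrackerBackgroundUI>();
    }
}
```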
Tip
Both components are for desktop use only. On mobile with AR Foundation, the AR Camera Background component handles everything.
Tips¶
- Use Native mode for quick desktop development with a USB camera
- Use Injected mode for AR Foundation on mobile — the AR camera feeder handles everything
- Use RealSense when you need depth on desktop — depth tracking significantly improves robustness
- Record sequences early in development to build a test suite you can replay without hardware
- If tracking quality is poor on a desktop camera, provide a proper calibration file rather than relying on estimated intrinsics