This tutorial walks through setting up full-body tracking on the Meta Quest 3 in Unity using the Meta Movement SDK, then implementing a record-and-replay workflow: one scene captures body-tracked motion to a serializable format, and a second scene plays it back on a skeleton you can visualize, color-code, or overlay metrics on.
The pattern is broadly useful for any project that wants to separate the act of motion capture from the act of motion review — physical therapy and rehab, sports analysis, dance, animation prototyping, accessibility studies. For a specific application, see VR for Physical Therapy & Movement Rehabilitation.
For when this workflow inevitably breaks: Diagnosing Meta Movement SDK Retargeting Failures in Unity.
This is the single most important step. The Meta Core SDK and Meta Movement SDK have a strict version-compatibility requirement that is not enforced by Unity Package Manager. A Core SDK newer than the matching Movement SDK (or vice versa) silently corrupts retargeted joint data — segment lengths drift, limbs detach, retargeter error counts climb in the console.
A confirmed-working combination used by PT Detective:
| Component | Version |
| --- | --- |
| Meta Core SDK | [your version] |
| Meta Movement SDK | [your version] |
| Unity | 2022.3 LTS |
| XR Plug-in Provider | OpenXR (NOT Meta Quest XR Plugin) |
Fill in the exact versions that worked for your project.
The fix when these get out of sync: manually downgrade Core SDK in Packages/manifest.json to match the Movement SDK's expected Core SDK version. The Movement SDK's release notes will list the matching Core SDK it ships against — always consult that list rather than letting Package Manager auto-resolve.
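Concretely, the pin lives in `Packages/manifest.json`. A sketch of what it looks like — the package names shown are Meta's current ones, but verify them against your own manifest, and the version string is a placeholder for whatever the Movement SDK release notes specify:

```json
{
  "dependencies": {
    "com.meta.movement": "https://github.com/oculus-samples/Unity-Movement.git",
    "com.meta.xr.sdk.core": "<core version from the Movement SDK release notes>"
  }
}
```

After editing the manifest, delete the `Packages/packages-lock.json` entry for the Core SDK (or let Unity regenerate the lock file) so Package Manager does not silently restore the newer version.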
Counterintuitively, you do not use the Meta Quest XR Plugin alongside Meta Movement SDK. Use OpenXR with the Meta Quest Support feature enabled.
In Project Settings > XR Plug-in Management:
Android tab: enable OpenXR, disable Meta Quest XR Plugin if present
Under OpenXR > Feature Groups > Android:
Enable Meta Quest Support
Enable Body Tracking feature
Enable Hand Tracking Subsystem if you also need hand tracking
Leave Eye/Face Tracking off unless you actually use them — they slow startup
In Player Settings > Android:
Minimum API Level: Android 10.0 (API 29) or higher
Scripting Backend: IL2CPP
Target Architectures: ARM64 only
The capture scene's job is simple: render a minimal environment, drive a body-tracked rig from the user's movement, and serialize joint data per frame to memory or disk.
Steps:
Create a new scene; add the OVRCameraRig prefab (or equivalent OpenXR rig)
Add a body-tracked rig prefab from the Movement SDK samples — typically the retargeted humanoid skeleton found in the SDK's sample scenes
Attach a custom MotionRecorder MonoBehaviour to a manager GameObject. In LateUpdate() it samples the body-tracked rig's joint positions and rotations and appends to an in-memory list
Add a UI trigger (controller button press) for start/stop recording
On stop, serialize the in-memory list to a JSON file written to Application.persistentDataPath, or to a MotionClip ScriptableObject for in-editor testing
Code sketch (simplified):
```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;
using UnityEngine;

// JsonUtility only serializes public fields on [Serializable] types.
[System.Serializable] public class JointSample { public Vector3 pos; public Quaternion rot; }
[System.Serializable] public class FrameData { public double timestamp; public JointSample[] joints; }
[System.Serializable] public class MotionClip { public FrameData[] frames; }

public class MotionRecorder : MonoBehaviour {
    public Transform[] trackedJoints;
    private List<FrameData> frames = new();
    private bool recording = false;

    public void StartRecording() { frames.Clear(); recording = true; }
    public void StopRecording() { recording = false; }

    // Sample in LateUpdate: body-tracking poses are finalized after Update.
    void LateUpdate() {
        if (!recording) return;
        var frame = new FrameData {
            timestamp = Time.timeAsDouble,
            joints = trackedJoints.Select(j => new JointSample {
                pos = j.position, rot = j.rotation
            }).ToArray()
        };
        frames.Add(frame);
    }

    public void Save(string path) {
        var clip = new MotionClip { frames = frames.ToArray() };
        File.WriteAllText(path, JsonUtility.ToJson(clip));
    }
}
```
Sample in LateUpdate, not Update — body-tracking poses are written during the frame and finalized after Update
Capture both position and rotation per joint; rotation-only or position-only data is insufficient for clean playback
Sampling at 60–90 Hz is fine; storing every frame is cheap (a few MB per minute)
If you serialize JSON to disk on the Quest, write to Application.persistentDataPath, not Application.dataPath — only the former is writable
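Wiring the trigger and save path together — a sketch that assumes the recorder exposes `StartRecording()`/`StopRecording()` methods (hypothetical names; add them if yours only has a private flag) and uses Meta's `OVRInput` API for the controller button:

```csharp
using System.IO;
using UnityEngine;

public class RecordingToggle : MonoBehaviour {
    public MotionRecorder recorder;
    private bool recording;

    void Update() {
        // A button on the right Touch controller toggles recording.
        if (OVRInput.GetDown(OVRInput.Button.One)) {
            recording = !recording;
            if (recording) {
                recorder.StartRecording();
            } else {
                recorder.StopRecording();
                // persistentDataPath is the only writable location on-device.
                string path = Path.Combine(Application.persistentDataPath,
                    $"capture_{System.DateTime.Now:yyyyMMdd_HHmmss}.json");
                recorder.Save(path);
            }
        }
    }
}
```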
The replay scene's job: load a recording, drive a non-tracked skeleton GameObject from it, expose timeline scrubbing, and overlay analysis (color-coding, metric values).
Steps:
Create a new scene with no body-tracked rig — you do not want the Movement SDK driving the skeleton in this scene
Build a humanoid skeleton GameObject hierarchy (one Transform per joint, parented in body order). Easiest: instantiate the same skeleton mesh as the capture scene, but strip the BodyTracking component and add your own playback driver
Attach a MotionPlayer MonoBehaviour that loads a MotionClip, exposes a currentTime property, and writes joint poses every frame:
```csharp
using UnityEngine;

public class MotionPlayer : MonoBehaviour {
    public MotionClip clip;
    public Transform[] skeletonJoints;
    public float currentTime = 0f;  // seconds from clip start; driven by the scrub slider
    public float speed = 1f;        // 0.25x / 0.5x / 1x playback multiplier
    public bool playing = false;

    void Update() {
        if (clip == null || clip.frames.Length == 0) return;
        if (playing) currentTime += Time.deltaTime * speed;

        var frame = clip.SampleAt(currentTime); // linear interp between recorded frames
        for (int i = 0; i < skeletonJoints.Length; i++) {
            skeletonJoints[i].position = frame.joints[i].pos;
            skeletonJoints[i].rotation = frame.joints[i].rot;
        }
    }
}
```
Add a timeline UI: slider for scrubbing, play/pause button, speed multiplier (0.25×, 0.5×, 1×). Standard Unity UGUI works fine in world space
For analysis overlays: each frame, recompute biomechanical metrics from the joint positions and update colors on bones — e.g., red knee mesh when valgus exceeds threshold. See VR for Physical Therapy & Movement Rehabilitation for the metric set used in PT Detective
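The `SampleAt` helper the player calls is left to you. A minimal sketch, assuming the `FrameData`/`JointSample` layout from the capture section and timestamps recorded with `Time.timeAsDouble` (so a clip-relative time must be offset by the first frame's timestamp):

```csharp
// Add inside MotionClip. frames must be ordered by timestamp.
public FrameData SampleAt(float currentTime) {
    double t = frames[0].timestamp + currentTime; // clip-relative -> absolute
    if (t <= frames[0].timestamp) return frames[0];
    if (t >= frames[^1].timestamp) return frames[^1];

    // Linear scan for the bracketing frame pair; cache the last index
    // if scrubbing long clips becomes a hotspot.
    int i = 0;
    while (frames[i + 1].timestamp < t) i++;

    FrameData a = frames[i], b = frames[i + 1];
    float u = (float)((t - a.timestamp) / (b.timestamp - a.timestamp));

    var joints = new JointSample[a.joints.Length];
    for (int j = 0; j < joints.Length; j++) {
        joints[j] = new JointSample {
            pos = Vector3.Lerp(a.joints[j].pos, b.joints[j].pos, u),    // positions: lerp
            rot = Quaternion.Slerp(a.joints[j].rot, b.joints[j].rot, u) // rotations: slerp
        };
    }
    return new FrameData { timestamp = t, joints = joints };
}
```

Slerp rather than Lerp on the quaternions keeps interpolated rotations on the unit sphere, which matters at slow playback speeds where frames are stretched apart.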
Mac development for Quest 3 has no Quest Link option — the loop is Build APK → ADB sideload → run on device:
File > Build Settings, target Android, switch platform
Click Build → produces .apk in your chosen folder
With Quest connected via USB-C and developer mode enabled: adb install -r path/to/build.apk
Launch from Library > Unknown Sources on the Quest
Iteration tip: a build_and_deploy.sh shell script that runs Unity's batch-mode build followed by adb install -r cuts the loop from ~3 minutes of clicking to one shell command.
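A sketch of such a script, with assumptions flagged: the Unity editor path and version are placeholders, `BuildScript.BuildAndroid` is a static editor method you write yourself (a wrapper around `BuildPipeline.BuildPlayer` that writes the APK to the path below), and the package name in the launch step must match your project's application identifier:

```shell
#!/usr/bin/env bash
# build_and_deploy.sh — batch-mode Unity build, then sideload to the Quest.
set -euo pipefail

UNITY="/Applications/Unity/Hub/Editor/2022.3.XXf1/Unity.app/Contents/MacOS/Unity"
PROJECT="$(pwd)"
APK="$PROJECT/Builds/app.apk"

# Headless build; BuildScript.BuildAndroid is your own editor method.
"$UNITY" -batchmode -quit -projectPath "$PROJECT" \
    -executeMethod BuildScript.BuildAndroid

# Reinstall over the existing build and launch it on the headset.
adb install -r "$APK"
adb shell monkey -p com.yourstudio.yourapp 1
```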
The most common failure mode is the Core SDK / Movement SDK version mismatch described in the setup section. Diagnostic signals:
Console fills with retargeter warnings
Skeleton segment lengths visibly drift between frames
Limbs appear detached from the body or float in space
For the full diagnostic walkthrough, see Diagnosing Meta Movement SDK Retargeting Failures in Unity.
VR for Physical Therapy & Movement Rehabilitation
Diagnosing Meta Movement SDK Retargeting Failures in Unity
Unity Hand Gesture Tracking
Quest 3 Setup Guide
Sanil Desai (Spring 2026)