PT Detective is a Meta Quest 3 application for physical therapy movement analysis, built for Brown CSCI 1951T (Augmented Data Visualization for Research, Spring 2026). This page is the complete walkthrough: problem framing, system architecture, code-level implementation of both scenes, the six biomechanical metrics, the in-person user study (n=8) comparing VR review against 2D video review, and the lessons that came out of building it.
For higher-level application context, see VR for Physical Therapy & Movement Rehabilitation. For the technical setup of body tracking and the record-and-replay pattern in isolation, see Meta Movement SDK Body Tracking and Record-and-Replay Workflow in Unity.
Physical therapists assessing patient movement face a tooling gap. At one end of the spectrum, marker-based motion capture labs deliver high-accuracy joint kinematics but cost hundreds of thousands of dollars, require trained operators, and never leave the lab. At the other, smartphone video is cheap and ubiquitous but is locked to a single camera angle, hard to measure precisely, and forces the clinician to make 3D judgments from 2D evidence.
PT Detective explores a middle ground: use a $500 Quest 3 to capture a patient's movement via inside-out body tracking, then let the clinician review that movement as a fully volumetric 3D skeleton with biomechanical metrics overlaid in real time. The clinician can rotate around the patient, freeze frames, scrub timelines, and inspect joints from whichever angle best reveals the relevant pathology.
The project's core question: does volumetric VR review actually help a reviewer detect movement anomalies more effectively than 2D video?
PT Detective is a two-scene Unity application. Separation of capture and review is deliberate — the patient performing movement and the clinician reviewing it are different roles, with different UI needs and different visualizations.
Patient scene — minimal environment, OVR camera rig with body tracking, controller-triggered recording start/stop, on-headset visual feedback to the patient. Records the user's body-tracked motion to a serialized motion clip on persistent storage.
PT Review scene — review-room environment, a non-tracked humanoid skeleton GameObject driven by playback of a recorded clip, world-space timeline UI for scrubbing, and a per-frame metrics panel with color-coded bone overlays.
The single largest blocker on this project was a silent incompatibility between the Meta Core SDK and Meta Movement SDK. With slightly mismatched versions, Unity Package Manager raises no warnings, the project builds, the APK installs, and the body-tracked rig appears to work — but segment lengths drift across frames, retargeter error counts climb in the console, and the resulting motion data is corrupt enough to make biomechanical metrics meaningless.
Confirmed-working stack:
The fix when these get out of sync is a manual downgrade in Packages/manifest.json to whatever Core SDK version the Movement SDK release notes explicitly call out as compatible. Auto-resolution by Package Manager will not pick this for you.
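In Packages/manifest.json the pin looks roughly like the fragment below. The version string is deliberately a placeholder for whatever the Movement SDK release notes specify, and the package identifier shown is the usual one for the Meta XR Core SDK; confirm both against your installed packages.

```json
{
  "dependencies": {
    "com.meta.xr.sdk.core": "<version named in the Movement SDK release notes>"
  }
}
```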
Configuration also matters: Meta Movement SDK works under OpenXR with the Meta Quest Support feature, not under the Meta Quest XR Plugin. The two plugins are alternatives, not complements.
Full setup steps and gotchas live on Meta Movement SDK Body Tracking and Record-and-Replay Workflow in Unity. Diagnostic signals for when retargeting silently fails live on Diagnosing Meta Movement SDK Retargeting Failures in Unity.
The patient scene is intentionally minimal. The patient is wearing the headset and performing a movement; the last thing they need is a visually busy environment.
Scene contents:
OVRCameraRig (or equivalent OpenXR rig) for head/controller tracking
A retargeted humanoid rig from the Movement SDK samples, bound to the user's body
A floor plane and ambient lighting — no avatar visible to the patient (looking at your own retargeted skeleton mid-squat is distracting and adds latency-perception load)
A RecordingManager GameObject with the recorder script and UI
A small heads-up indicator (red dot when recording, countdown when starting)
Recording flow:
Patient steps into the calibrated tracking volume
Operator triggers a 3-second countdown via controller
Recording starts; patient performs five bodyweight squats
Recording stops on second controller trigger
Clip is written to disk and the scene auto-transitions or shows a "saved" confirmation
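The recorder itself is a Unity C# script, but the clip format the flow above produces can be sketched language-independently. A hypothetical Python sketch of the record/replay data shape; the field names and JSON layout are illustrative, not the project's actual serialization:

```python
import json
from dataclasses import dataclass, field

@dataclass
class MotionClip:
    """Hypothetical serialized clip: fixed-rate frames of joint positions."""
    frame_rate: float = 60.0
    joint_names: list = field(default_factory=list)
    frames: list = field(default_factory=list)  # each frame: flat [x, y, z, ...] per joint

    def add_frame(self, positions):
        # Flatten one (x, y, z) tuple per joint into a single list.
        self.frames.append([coord for p in positions for coord in p])

    def save(self, path):
        with open(path, "w") as f:
            json.dump({"frameRate": self.frame_rate,
                       "joints": self.joint_names,
                       "frames": self.frames}, f)

    @classmethod
    def load(cls, path):
        with open(path) as f:
            d = json.load(f)
        clip = cls(d["frameRate"], d["joints"])
        clip.frames = d["frames"]
        return clip

    def sample(self, t):
        """Nearest-frame lookup at time t seconds, for timeline scrubbing."""
        i = min(int(round(t * self.frame_rate)), len(self.frames) - 1)
        return self.frames[i]
```

The same structure supports both playback in the review scene and offline metric computation.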
The review scene is where the project's actual value proposition lives. Everything in the scene exists to help the clinician find movement anomalies faster.
Scene contents:
A neutral review room (so the skeleton stays the visual focus)
A humanoid skeleton GameObject — the same hierarchy as the patient scene, but with the BodyTracking component stripped and a MotionPlayer attached instead
World-space UI panels: a timeline scrubber, a play/pause/speed control, a per-metric readout, and a clip selector
Controller ray interaction for poking UI buttons and dragging the timeline scrubber
A MetricEvaluator that reads the current frame's joint positions and emits per-metric scores
The metrics are drawn from two sources: Bishop et al. 2017 for asymmetry indexing, and Nae et al. 2017 for squat-specific assessment criteria. All six are computable from the joint position stream alone; no force plates or EMG are required.
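As an illustration of how such metrics reduce to joint-position geometry, here is a hedged Python sketch of two common building blocks: a three-point joint angle and one standard symmetry-index formulation from the asymmetry literature. The exact variants PT Detective's MetricEvaluator uses may differ.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c,
    e.g. hip-knee-ankle for knee flexion."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def asymmetry_index(left, right):
    """Percent asymmetry between a left and right measure; 0 = symmetric.
    One common formulation: |L - R| over the bilateral mean, times 100."""
    if left == 0 and right == 0:
        return 0.0
    return abs(left - right) / ((left + right) / 2) * 100
```

Feeding per-frame left/right knee flexion angles through asymmetry_index, for example, yields a per-frame asymmetry score suitable for the color-coded bone overlay.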
The study compared VR skeleton review (PT Detective) against 2D video review of the same recordings, within-subjects, with order counterbalanced.
n = 8 participants from the CSCI 1951T cohort
Design: within-subjects, counterbalanced crossover. Half started with VR and crossed over to 2D; the other half started with 2D and crossed over to VR.
Stimuli: pre-recorded squat clips containing simulated anomalies for each of the six metrics. Anomalies were embedded by selectively recording squats with deliberately exaggerated faults (e.g., overt knee valgus, exaggerated trunk lean), confirmed against the metric thresholds.
Task per condition: for each clip, the participant identified which of the six anomalies was present and rated their confidence.
Measurements: per-metric detection accuracy, overall preference, perceived task success, frustration (Likert), free-text qualitative comments.
Operator protocol: a fixed script for introduction, calibration, condition order, and debrief, to minimize operator-induced variance.
The full participant instructions, operator protocol, and the Google Form used for response collection are in the Spring 2026 course timeline.
Headline result. All 8 participants preferred VR over 2D video for squat review (8/8 = 100%). Under a two-sided exact binomial test against the null preference of 0.5, p ≈ 0.008.
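The reported p-value can be reproduced with a stdlib-only sketch of the two-sided exact binomial test, using the minimum-likelihood definition of "at least as extreme":

```python
from math import comb

def exact_binomial_two_sided(k, n, p0=0.5):
    """Two-sided exact binomial p-value: total probability of all outcomes
    no more likely than the observed count under the null."""
    pmf = [comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(n + 1)]
    observed = pmf[k]
    return sum(p for p in pmf if p <= observed + 1e-12)

# 8 of 8 participants preferred VR:
p = exact_binomial_two_sided(8, 8)  # 2 / 256 = 0.0078125
```

With a symmetric null of 0.5, this is simply twice the probability of a unanimous outcome.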
Per-metric detection accuracy. Wilcoxon signed-rank tests on per-participant detection accuracy by metric:
Where VR won (qualitatively and in effect size). Asymmetry detection, perceived task success, and reduced frustration. Asymmetry in particular benefits from the rotational freedom of VR — a clinician can orbit around the patient and put left and right limbs side-by-side in the same field of view, which a single fixed 2D camera cannot do.
Where 2D won. Forward trunk lean. The clearest sagittal-view frame from a tripod-mounted phone turns out to be a strong cue for trunk angle, and the VR skeleton's rotational freedom paradoxically hurts this metric — reviewers don't reliably stop at the sagittal angle that makes lean obvious. Likely design fix: a fixed-sagittal "shadow" silhouette pinned to a wall in the review scene.
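A pinned sagittal view would also let the system compute lean directly instead of leaving it to the reviewer's vantage point. A hedged Python sketch, assuming a y-up, z-forward coordinate convention and hip/shoulder midpoints precomputed by the caller; none of these names are from the actual codebase:

```python
import math

def trunk_lean_deg(hip_mid, shoulder_mid, forward=(0.0, 0.0, 1.0)):
    """Forward trunk lean in degrees from vertical, measured in the
    sagittal plane (positive = leaning toward the forward axis)."""
    trunk = [shoulder_mid[i] - hip_mid[i] for i in range(3)]
    up_component = trunk[1]  # y-up assumption
    fwd_component = sum(trunk[i] * forward[i] for i in range(3))
    return math.degrees(math.atan2(fwd_component, up_component))
```

A perfectly upright trunk scores 0°; a shoulder midpoint displaced as far forward as upward scores 45°.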
Statistical caveat. With n=8, the Wilcoxon signed-rank test's null distribution is discrete and coarse: only the most extreme values of the statistic (W ≤ 3 or W ≥ 33, out of a maximum of 36) yield a two-sided p below 0.05, and ties coarsen the attainable p-value grid further. Most per-metric Wilcoxon comparisons did not reach significance even where rank-biserial effect sizes were large. This is a structural feature of small-n nonparametric testing, not evidence of weak effects in the data, and it is communicated explicitly in the analysis writeup.
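The discreteness point is easy to verify by brute force. A Python sketch that enumerates the exact null distribution of W for n untied, non-zero differences; this mirrors, but is not, the notebook's analysis code:

```python
from itertools import product

def wilcoxon_null_pvalues(n):
    """Exact null distribution of the Wilcoxon signed-rank statistic W for
    n non-zero, untied differences: enumerate all 2^n sign patterns and
    return the two-sided p-value attainable at each W."""
    ranks = range(1, n + 1)
    counts = {}
    for signs in product((0, 1), repeat=n):
        w = sum(r for r, s in zip(ranks, signs) if s)
        counts[w] = counts.get(w, 0) + 1
    total = 2 ** n
    max_w = n * (n + 1) // 2  # distribution is symmetric about max_w / 2
    pvals = {}
    for w in counts:
        tail = sum(c for v, c in counts.items() if v <= min(w, max_w - w))
        pvals[w] = min(1.0, 2 * tail / total)
    return pvals

pv = wilcoxon_null_pvalues(8)
significant = sorted(w for w, p in pv.items() if p < 0.05)
```

For n=8 only eight of the thirty-seven possible W values clear p < 0.05, which is exactly why large observed effects can still fail to reach significance.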
On the SDK. Treat Core SDK ↔ Movement SDK version compatibility as a first-class project concern, not a setup detail. Pin versions in manifest.json from the start, and treat retargeter error counts and segment-length stability as canary signals; catching them early saves hours of debugging downstream metric noise.
On the visualization. Rotational freedom is not unconditionally a win. The metric-by-metric results are an existence proof that some movement assessments benefit from arbitrary viewing angles (asymmetry) and others benefit from a constrained "best" angle (forward lean). Future iterations should consider mixed-mode review: free 3D rotation with optional pinned 2D-projected views for metrics where a known-best angle exists.
On the study. Counterbalanced crossover with simulated anomalies is a tractable design at classroom scale, but n=8 is structurally too small for nonparametric per-metric significance. Future versions of this study should either (a) recruit n≥16 to escape the discrete-distribution floor, (b) collapse across metrics for an aggregate accuracy comparison, or (c) shift to validated effect-size reporting and de-emphasize p-values entirely.
On engineering velocity. A build_and_deploy.sh shell script wrapping a Unity batch-mode build plus adb install -r reduced the dev loop from minutes of clicking to a single command. On a Mac without Quest Link, build-deploy automation is the single highest-leverage engineering investment for a project of this kind.
Unity project repository (Patient scene + PT Review scene, all scripts above, prefabs)
Build scripts (build_and_deploy.sh)
Recorded study stimuli (anomaly-embedded squat clips)
Python analysis notebook (Wilcoxon signed-rank, rank-biserial effect sizes, exact binomial preference test, matplotlib plots)
PowerPoint deck (deep blue/teal palette, used for in-class presentation)
Participant instructions document
Operator protocol document
Google Form specification (for participant response collection)
Wiki contributions:
VR for Physical Therapy & Movement Rehabilitation (parent application page)
Meta Movement SDK Body Tracking and Record-and-Replay Workflow in Unity (technical tutorial)
Diagnosing Meta Movement SDK Retargeting Failures in Unity (troubleshooting page)
This walkthrough
Sagittal-view widget — pinned silhouette projection of the skeleton onto a wall plane, addressing the forward-lean detection gap surfaced by the user study.
Validation against gold-standard mocap. Quest 3 inside-out body tracking has not been independently validated against marker-based reference for the metrics used here. A small validation study (Quest + reference mocap simultaneously) would establish the absolute accuracy floor.
Beyond the squat. Lunges, single-leg balance, gait, and shoulder ROM are natural next movements; the architecture (record/replay/metric overlay) is movement-agnostic.
Larger study. n≥16 to escape the small-sample test-power floor; recruitment from a clinical PT population rather than CS students for ecological validity.
Anomaly detection from real (not simulated) faults. Replace operator-generated bad squats with patient recordings exhibiting genuine compensations.
VR for Physical Therapy & Movement Rehabilitation (parent)
Meta Movement SDK Body Tracking and Record-and-Replay Workflow in Unity
Diagnosing Meta Movement SDK Retargeting Failures in Unity
VR in Medicine
Sanil Desai (Spring 2026)