Written by: Ellie Na (2026)
In this page, I'll walk through how to import the Meta hand tracking SDK into Unity and how hand gestures are detected from hand poses and shapes. It also explores the default hand gestures that ship with the packages, which can serve as a foundation for developing custom gesture-based event triggers in future work.
1. Open Unity
2. Window > Package Manager
Install XR Interaction Toolkit > go to its Samples tab and import the samples shown in the screenshot below
Install XR Hands > go to its Samples tab and import the samples shown in the screenshot below
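If you prefer editing the package manifest directly instead of clicking through the Package Manager, the dependencies added above look roughly like this in Packages/manifest.json. The package names are the real Unity package identifiers; the version numbers are assumptions, so use whatever the Package Manager resolves for your editor version.

```json
{
  "dependencies": {
    "com.unity.xr.interaction.toolkit": "2.5.2",
    "com.unity.xr.hands": "1.4.0",
    "com.unity.xr.management": "4.4.0",
    "com.unity.xr.openxr": "1.10.0"
  }
}
```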
3. Edit > Project Settings > XR Plug-in Management
Install XR Plug-in Management if it is not installed yet
In the PC tab (small monitor icon), enable 'OpenXR' and fix every issue flagged with a yellow warning icon
In the Android tab (small Android icon), enable 'OpenXR' and fix every issue flagged with a yellow warning icon
4. Project Settings > OpenXR (appears once it is enabled in XR Plug-in Management) > add the interaction profiles you need and check the items under OpenXR Feature Groups as in the screenshots below
5. Assets > search for 'XR Origin' > drag and drop the prefab into the Hierarchy panel
In the Hierarchy panel, right-click the 'Camera Offset' object and choose Create Empty; this creates a new GameObject > rename it to 'Hand Visualizer'
Select 'Hand Visualizer' > in the Inspector, click 'Add Component' > search for and add the 'Hand Visualizer' component > drag and drop the references as shown below > the remaining display-related hand tracking settings are optional
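Once the scene is set up, the same hand data the visualizer draws can be read from script. The sketch below (my own example, not part of this guide; the component name `HandJointLogger` is made up) uses the `XRHandSubsystem` API from com.unity.xr.hands to log the right index-tip pose each frame:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical component: logs the right index-tip position once hand
// tracking is running. Attach it to any GameObject in the scene.
public class HandJointLogger : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily find the running hand subsystem (created by the OpenXR
        // Hand Tracking feature enabled in the Project Settings steps).
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        var joint = hand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip (XR Origin space): {pose.position}");
    }
}
```

Note that joint poses are reported relative to the XR Origin, not in world space.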
6. Test in a VR headset
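As a minimal example of the custom gesture-based event triggers mentioned at the top, the sketch below fires a UnityEvent when the right thumb tip and index tip come close together (a basic pinch). This is an assumption-laden illustration, not the package's built-in gesture system: the class name and the 0.02 m threshold are mine, and the threshold should be tuned on-device.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.XR.Hands;

// Hypothetical custom gesture trigger: invokes onPinch on the frame the
// right thumb tip and index tip first move within the pinch threshold.
public class SimplePinchTrigger : MonoBehaviour
{
    public UnityEvent onPinch;
    const float k_PinchThreshold = 0.02f; // metres between fingertips (assumed)

    XRHandSubsystem m_Subsystem;
    bool m_WasPinching;

    void Update()
    {
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
        {
            m_WasPinching = false;
            return;
        }

        if (hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out var thumbPose) &&
            hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out var indexPose))
        {
            bool pinching =
                Vector3.Distance(thumbPose.position, indexPose.position) < k_PinchThreshold;
            if (pinching && !m_WasPinching)
                onPinch.Invoke(); // rising edge: pinch just started
            m_WasPinching = pinching;
        }
    }
}
```

The same distance-based pattern extends to other fingertip pairs, which is one simple route to the custom gestures discussed above.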
XR Hands installation guide: https://docs.unity3d.com/Packages/com.unity.xr.hands@1.4/manual/project-setup/install-xrhands.html
More detailed video walkthrough: https://youtu.be/mJ3fygb9Aw0?si=rA-jeFlygQIxBBfo