Written by: Ellie Na (2026)
This page covers how to make UI buttons in VR physically pressable with your fingers, no controllers needed. The goal is to reach out with your index finger, poke a button, and have it respond. The first attempt used Unity's built-in XR Poke Interactor (from XR Interaction Toolkit). It failed. This page documents both what went wrong and the simpler custom approach that actually worked.
Hand gesture tracking must already be set up
(See the Unity Hand Gesture Tracking Setup page for setup instructions)
'XR Hands' package installed
Target Device: Meta Quest 3
The XR Interaction Toolkit includes an XRPokeInteractor component designed for exactly this use case. The first attempt: add XR Poke Interactor to the hand GameObject - add XR Simple Interactable to the button - add a Box Collider to the button - wire the OnClick event to the desired method.
Problem 1. m_AttachTransform was empty - The Poke Interactor needs to know where the fingertip is in space, which is set via AttachTransform. Without it, the interactor has no reference point, so it doesn't know what is doing the "poking."
Problem 2. "Finger shape type Unspecified is invalid" - This error appeared because m_RequirePokeFilter was enabled by default, but the filter had no valid finger shape configured. XR Hands joint data wasn't being passed through correctly to the interactor's filter system. Attempting to fix m_AttachTransform via a helper script (SimplePokeSetup.cs) using reflection didn't resolve the finger shape detection error. The XR Interaction Toolkit's poke system requires deeper integration with the XR Hands joint pipeline than this project's setup supported.
-> XRPokeInteractor works well when your XR Hands joint hierarchy is fully wired up through the XR Interaction Toolkit's hand tracking pipeline. If you're using XR Hands only for gesture detection (e.g. custom gesture scripts), the poke interactor's finger shape filter won't have the data it needs and will throw errors.
Instead of relying on XR Interaction Toolkit's poke system, the solution was to use basic Unity physics: attach a small trigger collider to the index fingertip, and detect when it enters a button's collider. This requires no XR Interaction Toolkit dependency and works reliably with XR Hands joint tracking.
What is a Box Collider?
A Box Collider is an invisible rectangular boundary that Unity uses to detect physical overlap between objects. It doesn't have to match the visual shape of the button exactly; it just defines the touchable zone. For VR buttons, we need a Box Collider because Unity's default Button component only responds to screen-based input (mouse clicks, touch). Without a collider, the fingertip sphere from HandColliderSetup has nothing to collide with: the button is visually there, but physically invisible.
Scripts you will need
HandColliderSetup.cs - adds a small sphere trigger collider to the index fingertip and tracks its position every frame using XR Hands joint data (see the sketch below)
SimpleTouchButton.cs - detects when the fingertip collider enters the button's trigger zone and fires a UnityEvent; it replaces Unity's default Button component, which only responds to mouse/touch input, not physical collisions (see the sketch below)
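A minimal sketch of what HandColliderSetup.cs can look like, assuming the component lives on the XR Origin (XR Hands reports joint poses in XR Origin space) and that a "Fingertip" tag has been added in the Tag Manager. The actual project script may differ in details like the sphere radius and which hand it tracks:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch of HandColliderSetup: follows the right index fingertip with a small trigger sphere.
// Assumes this component sits on the XR Origin, since XR Hands reports joint poses in
// XR Origin space. "Fingertip" is a hypothetical tag - define it in the Tag Manager first.
public class HandColliderSetup : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;
    Transform m_FingertipTip;

    void Start()
    {
        // Create the fingertip proxy as a child of the XR Origin.
        var go = new GameObject("IndexTipCollider");
        go.transform.SetParent(transform, false);
        go.tag = "Fingertip";

        var sphere = go.AddComponent<SphereCollider>();
        sphere.isTrigger = true;
        sphere.radius = 0.008f; // roughly fingertip-sized (8 mm)

        // A kinematic Rigidbody is required so OnTriggerEnter fires on the button side.
        var rb = go.AddComponent<Rigidbody>();
        rb.isKinematic = true;

        m_FingertipTip = go.transform;
    }

    void Update()
    {
        // Lazily grab the running XRHandSubsystem.
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            m_Subsystem = subsystems.Count > 0 ? subsystems[0] : null;
            if (m_Subsystem == null)
                return;
        }

        // Follow the right-hand index tip joint; swap to leftHand if you poke left-handed.
        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        var joint = hand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
        {
            // Joint poses are local to the XR Origin, so apply them as local transforms.
            m_FingertipTip.localPosition = pose.position;
            m_FingertipTip.localRotation = pose.rotation;
        }
    }
}
```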
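And a minimal sketch of SimpleTouchButton.cs, assuming the same "Fingertip" tag; the debounce interval is illustrative:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch of SimpleTouchButton: fires a UnityEvent when the fingertip collider
// enters this object's trigger collider.
[RequireComponent(typeof(Collider))]
public class SimpleTouchButton : MonoBehaviour
{
    [Tooltip("Invoked once each time the fingertip enters the button's trigger zone.")]
    public UnityEvent OnButtonPressed;

    [Tooltip("Minimum time between presses so a lingering finger doesn't double-fire.")]
    public float debounceSeconds = 0.3f;

    float m_LastPressTime = float.NegativeInfinity;

    void OnTriggerEnter(Collider other)
    {
        // Only react to the fingertip proxy created by HandColliderSetup
        // (identified here by the hypothetical "Fingertip" tag).
        if (!other.CompareTag("Fingertip"))
            return;

        if (Time.time - m_LastPressTime < debounceSeconds)
            return;

        m_LastPressTime = Time.time;
        OnButtonPressed?.Invoke();
    }
}
```

For OnTriggerEnter to fire, at least one of the two colliders needs a Rigidbody; the kinematic Rigidbody on the fingertip sphere covers that.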
How to set up
Select your button GameObject (e.g. Button (Save)) - Add Component 'Box Collider' - Make sure 'Is Trigger' is checked - Resize the collider to match the button's visible area
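If you'd rather do the collider setup from code than in the Inspector, a rough equivalent (ButtonColliderSetup and the 0.02 m depth are illustrative, not part of the project):

```csharp
using UnityEngine;

// Optional alternative to the Inspector steps above: size the trigger Box Collider from code.
[RequireComponent(typeof(RectTransform))]
public class ButtonColliderSetup : MonoBehaviour
{
    [Tooltip("Depth of the touchable zone, so the fingertip can actually enter it.")]
    public float depth = 0.02f;

    void Awake()
    {
        var box = GetComponent<BoxCollider>();
        if (box == null)
            box = gameObject.AddComponent<BoxCollider>();

        box.isTrigger = true; // trigger zone, not a solid obstacle

        // Match the collider to the button's RectTransform (both are in local units).
        var rect = GetComponent<RectTransform>().rect;
        box.size = new Vector3(rect.width, rect.height, depth);
        box.center = Vector3.zero;
    }
}
```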
Add Component 'Simple Touch Button' - Click + in the On Button Pressed UnityEvent field - Drag in the target GameObject and select the method to call (e.g. AnnotationManager.SaveCurrentAnnotation)
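The same wiring can also be done from code instead of the Inspector. This sketch is illustrative (SaveButtonWiring is not a project script) and just logs the press where you would call your own method:

```csharp
using UnityEngine;

// Illustrative alternative to Inspector wiring: subscribe to OnButtonPressed from code.
// Replace the Debug.Log with your own call, e.g. AnnotationManager.SaveCurrentAnnotation
// in this project.
public class SaveButtonWiring : MonoBehaviour
{
    public SimpleTouchButton saveButton;

    void OnEnable()  => saveButton.OnButtonPressed.AddListener(HandlePressed);
    void OnDisable() => saveButton.OnButtonPressed.RemoveListener(HandlePressed);

    void HandlePressed()
    {
        Debug.Log("Save button poked");
        // e.g. annotationManager.SaveCurrentAnnotation();
    }
}
```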
Visual feedback helps confirm that the system recognized the user's action, reducing uncertainty and making interactions feel more responsive and reliable. To give visual feedback when a button is poked, use Unity's built-in Color Tint transition on the Button component.
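Note that SimpleTouchButton bypasses Unity's pointer event system, so the Button's Color Tint transition may not react to physical pokes on its own. If that happens, one simple alternative is to tint the button's Image directly from the On Button Pressed event; the component name, color, and timing below are illustrative:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Sketch of explicit press feedback: briefly tint the button's Image when the press fires.
// Wire the SimpleTouchButton's On Button Pressed event to Flash().
[RequireComponent(typeof(Image))]
public class ButtonPressFlash : MonoBehaviour
{
    public Color pressedColor = new Color(0.7f, 0.9f, 1f);
    public float flashSeconds = 0.15f;

    Image m_Image;
    Color m_NormalColor;

    void Awake()
    {
        m_Image = GetComponent<Image>();
        m_NormalColor = m_Image.color;
    }

    public void Flash()
    {
        StopAllCoroutines();
        StartCoroutine(FlashRoutine());
    }

    IEnumerator FlashRoutine()
    {
        m_Image.color = pressedColor;
        yield return new WaitForSeconds(flashSeconds);
        m_Image.color = m_NormalColor;
    }
}
```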