By: Aarav Kumar, 2025
Interacting with the virtual world starts with understanding how to capture inputs from controllers. Whether you're building a game, an interactive experience, or a utility app, recognizing button presses, trigger pulls, and joystick movements is essential for responsive and immersive interactions.
This tutorial will break down how to get controller inputs from Meta Quest in Unity. You'll learn how to access button states, track thumbstick movement, and integrate these inputs into your project. This tutorial will give you a foundation to make your application react to user interactions from the controllers.
Go to the following links and follow the instructions to get a base Unity project set up for XR interactions with the Meta XR SDK:
Setting up Unity for XR Development: https://developers.meta.com/horizon/documentation/unity/unity-project-setup/
Setting up Your Device for XR Development: https://developers.meta.com/horizon/documentation/unity/unity-env-device-setup/
Make sure to "Fix All" outstanding issues for the Android platform (by navigating to Meta > Tools > Project Setup Tool) before you continue.
Delete any camera object (if it appears in your scene) for now, as a camera rig will be added later in the tutorial.
Navigate to Edit > Project Settings. Then, under Player > Other Settings, find Active Input Handling and set it to "Input Manager (Old)". You can experiment with the new Input System if you wish, but you will likely need to adjust further settings and deviate from this tutorial.
Meta Quest controllers have multiple input types, including analog thumbsticks, buttons, triggers, and grip buttons. To manage these inputs efficiently, the OVRInput API provides a unified system for querying both virtual and raw controller states.
To use OVRInput, you either need an OVRManager component in your scene, or you must call OVRInput.Update() and OVRInput.FixedUpdate() at the beginning of your script’s Update() and FixedUpdate() methods, respectively. This ensures that controller tracking is handled correctly, allowing access to position, orientation, and button states.
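The manual-update pattern described above can be sketched as a small MonoBehaviour. This is a hedged sketch, assuming the Meta XR SDK (and thus OVRInput) is imported; if your scene already contains an OVRManager, these update calls are handled for you.

```csharp
using UnityEngine;

// Sketch: pump OVRInput manually when no OVRManager is in the scene.
public class ControllerInputPump : MonoBehaviour
{
    void Update()
    {
        OVRInput.Update(); // refresh controller state once per frame

        // Example query: did the A button go down this frame?
        if (OVRInput.GetDown(OVRInput.Button.One))
            Debug.Log("A button pressed");
    }

    void FixedUpdate()
    {
        OVRInput.FixedUpdate(); // keep physics-rate queries in sync
    }
}
```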
OVRInput provides three types of mappings:
Virtual Mapping (Combined Controller) – Treats both controllers as a single gamepad-like input, useful for cross-platform compatibility.
Virtual Mapping (Individual Controllers) – Allows direct access to specific controllers, such as OVRInput.Controller.LTouch (left) or OVRInput.Controller.RTouch (right).
Raw Mapping – Provides low-level access to specific buttons without abstraction.
Virtual Mapping (Combined Controller)
Virtual Mapping (Individual Controller)
Raw Mapping
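The three mapping types can be seen side by side by querying the same physical A button each way. This is a sketch, assuming an OVRManager in the scene keeps OVRInput updated:

```csharp
using UnityEngine;

// Sketch: one physical A press, queried through each mapping style.
public class MappingStylesDemo : MonoBehaviour
{
    void Update()
    {
        // Virtual mapping, combined controller: Button.One is A on the right Touch.
        bool combined = OVRInput.Get(OVRInput.Button.One);

        // Virtual mapping, individual controller: ask the right Touch explicitly.
        bool individual = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.RTouch);

        // Raw mapping: the physical A button, no abstraction.
        bool raw = OVRInput.Get(OVRInput.RawButton.A);

        if (combined || individual || raw)
            Debug.Log("A is held down");
    }
}
```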
You can write your own script to implement any of these mapping types, following the documentation (link, link2) to map the controllers' individual or combined inputs to different actions. Alternatively, you can use the Controller Buttons Mapper building block that Meta provides for Unity, which maps controller inputs to actions in your scene with minimal code and simplifies the mapping process.
This tutorial will focus on the building block method, but feel free to write your own mapping script if you feel comfortable doing so or want very precise control over the mapping.
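If you do go the custom-script route, a minimal sketch might read the thumbstick and trigger directly. This assumes an OVRManager in the scene; the threshold value below is an arbitrary choice for illustration.

```csharp
using UnityEngine;

// Minimal hand-rolled mapping sketch: right-hand thumbstick and index trigger.
public class SimpleMapper : MonoBehaviour
{
    void Update()
    {
        // 2D thumbstick position, each axis in [-1, 1].
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick,
                                     OVRInput.Controller.RTouch);

        // Analog index trigger, 0 (released) to 1 (fully pulled).
        float trigger = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger,
                                     OVRInput.Controller.RTouch);

        if (trigger > 0.5f) // arbitrary illustrative threshold
            Debug.Log($"Trigger pulled while stick at {stick}");
    }
}
```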
From the top menu, navigate to Meta > Tools > Building Blocks.
Search for the "Controller Buttons Mapper" building block, and click the "add" button to add it to your scene (or drag and drop it into your scene directly). Any dependencies of Controller Buttons Mapper (in this case the Camera Rig) will automatically be installed when you add it to your scene, so don't be surprised if you see extra elements show up.
Also add the Controller Tracking and Passthrough building blocks (the latter is needed if you want passthrough interactions; skip it if you want to be purely in VR).
As a sample interactive element, we can now create a simple cube to interact with using our controller buttons. In your scene, look for the + icon to add an asset, and navigate to 3D Object > Cube to make a cube in your scene.
Then, in your scene view, look for the Move Tool (should be an icon with an arrow in each direction, right under the hand icon). Select the move tool and then select the cube in your scene view to move its position along the 3 axes.
Move the cube to be in front of the Camera Rig element so that it is visible to the user when they launch the app -- you can check the direction of the Camera Rig by selecting it in the hierarchy.
You can also position, scale, and rotate the cube directly from the Inspector window on the right. To make the cube smaller, for example, I changed each of its X, Y, and Z scale fields to 0.1 (rather than the default of 1).
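If you prefer, the same cube setup can be done in code instead of the Editor. This is an optional sketch; the position below is a hypothetical value, so pick one that sits in front of your Camera Rig.

```csharp
using UnityEngine;

// Optional sketch: create and place the cube from a script at startup.
public class CubeSpawner : MonoBehaviour
{
    void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 1f, 1.5f); // hypothetical spot in front of the rig
        cube.transform.localScale = Vector3.one * 0.1f;      // 0.1 on each axis, as in the text
    }
}
```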
Creating a cube in the hierarchy
Move tool
Entire Unity Window -- Scene Hierarchy, Scene View, and Inspector Panels
Following this, click on the Controller Buttons Mapper element, and scroll down on the Inspector Window till you see the Controller Buttons Mapper (Script) panel. Here, you can now add Button Click Actions to map button clicks on your controller to trigger certain actions in your scene.
Add a Button Click Action by clicking the + button under "List is Empty"
We'll create an example interaction where clicking on the "A" button on your controller will hide the cube. To do this, we'll rename the title of the action (something like "Hide Cube"), set the Button to be "One", and set the Button Mode to be "On Button Down".
Button "One" corresponds to the A button (as the building block uses combined controller virtual mapping -- refer to the diagram for this above)
To make the button click hide the cube, we'll then add a Callback (from the + button under Callback)
Drag the cube object from the hierarchy to the first field (that says None (Object) by default). Then, click on the panel that says "No Function" and navigate to GameObject > SetActive (bool). Make sure the resulting checkbox is unchecked, as we want the cube to NOT be active when we click the A button (i.e. it should hide)
We can then add more Button Actions as we wish, such as clicking the B button (Button Two) to show the cube, as shown on the left.
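For reference, the two Button Click Actions above can also be written by hand. This is a hedged sketch of the equivalent behavior, assuming the Camera Rig's OVRManager keeps OVRInput updated; the cube field is assigned in the Inspector.

```csharp
using UnityEngine;

// Hand-written equivalent of the two building-block actions:
// A (Button.One) hides the cube, B (Button.Two) shows it again.
public class CubeVisibilityToggler : MonoBehaviour
{
    [SerializeField] GameObject cube; // drag the cube here in the Inspector

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.One)) // A pressed this frame
            cube.SetActive(false);                 // hide the cube
        if (OVRInput.GetDown(OVRInput.Button.Two)) // B pressed this frame
            cube.SetActive(true);                  // show it again
    }
}
```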
We are now ready to test out our project! Navigate to File > Build Profiles, and go to the Android tab (if it is not already selected). Make sure your Quest is connected to your computer (via USB-C) and set the Run Device to be your Quest device (it should show up in the list automatically). Then click Build and Run, and when the build completes, your project should be visible and interactable on your Quest headset!
If you want to see further resources on how to use controller mapping in Unity, check out: