This page is a beginner-friendly guide to using Meta XR Building Blocks to quickly set up interactive scenes in Unity for VR or MR projects. These prefabs and tools simplify development by allowing you to implement buttons, sliders, and ray-based or poke interactions without writing custom scripts. This approach is perfect for prototyping interactions, especially in projects like mine (an MR Spotify map using passthrough on the Quest 3).
Saves time: Avoid reinventing the wheel with basic UI/interactions.
No heavy coding required: You can hook up events directly in the Inspector (see the sketch after this list).
Compatible with passthrough, ray interaction, and hand tracking.
Helps with rapid iteration during prototyping or in-class testing.
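Wiring events in the Inspector only requires a public method to point at. Below is a minimal sketch of such a handler, assuming you attach it to a GameObject and select its method from the button's UnityEvent dropdown; the class and method names (ButtonFeedback, OnButtonPressed) are illustrative, not part of the SDK.

```csharp
using UnityEngine;

// Minimal sketch: a handler you can wire to a Building Block's
// UnityEvent in the Inspector. No interaction logic lives here --
// the Building Block prefab drives it.
public class ButtonFeedback : MonoBehaviour
{
    [SerializeField] private AudioSource clickSound; // optional feedback

    // Public, zero-argument methods show up in the Inspector's
    // UnityEvent dropdown, so they can be hooked up without code changes.
    public void OnButtonPressed()
    {
        Debug.Log($"{name} was pressed");
        if (clickSound != null) clickSound.Play();
    }
}
```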
Install the Meta XR SDK via the Unity Package Manager.
Import the Meta Interaction SDK (Building Blocks) package from Meta's GitHub or the Unity Asset Store.
Drag prefabs into your scene:
Use [PokeButton] for direct finger or controller touches.
Use [RayInteractor] for point-and-click selection from a distance.
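If you prefer wiring things up in code rather than in the Inspector, here is a hedged sketch that subscribes to a prefab's select event at runtime. It assumes the prefab exposes its events through an InteractableUnityEventWrapper component, which many Interaction SDK prefabs include; the component and event names can vary by SDK version, so check the prefab in your own project.

```csharp
using Oculus.Interaction; // Meta Interaction SDK namespace
using UnityEngine;

// Sketch: subscribe to a Building Block's select event from code.
// Assumes the prefab carries an InteractableUnityEventWrapper
// component (an assumption -- verify on your prefab).
public class PokeToggle : MonoBehaviour
{
    [SerializeField] private InteractableUnityEventWrapper buttonEvents;
    [SerializeField] private GameObject target; // object to show/hide

    private void OnEnable()
    {
        buttonEvents.WhenSelect.AddListener(Toggle);
    }

    private void OnDisable()
    {
        buttonEvents.WhenSelect.RemoveListener(Toggle);
    }

    private void Toggle()
    {
        target.SetActive(!target.activeSelf);
    }
}
```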
Make sure any canvases behind Building Block components are set to World Space, not Screen Space; screen-space UI does not render correctly in a headset.
Avoid placing UI too close to the user's face; offset it a bit for comfort (see the sketch after these tips).
Use the Hand Tracking Interaction Profile under OpenXR settings.
Some buttons may not register presses unless their colliders are properly aligned with the visible button geometry.
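The placement and collider tips above can be handled with plain Unity APIs. The sketch below positions a panel a comfortable distance in front of the user's head and resizes its BoxCollider to match the rendered bounds; the class name, field names, and the 0.6 m default are illustrative assumptions, not SDK values.

```csharp
using UnityEngine;

// Sketch: place a UI panel at a comfortable offset from the user and
// align its collider to the visible geometry so poke/ray hits land.
public class PanelPlacer : MonoBehaviour
{
    [SerializeField] private Transform head;        // e.g., the camera rig's eye anchor
    [SerializeField] private float distance = 0.6f; // roughly arm's length; tune for comfort

    private void Start()
    {
        // Offset the panel in front of the user rather than at their face.
        transform.position = head.position + head.forward * distance;
        // Rotate the panel so its readable side faces the user.
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);

        // Align the collider to the rendered bounds so presses register.
        var rend = GetComponentInChildren<Renderer>();
        var box = GetComponent<BoxCollider>();
        if (rend != null && box != null)
        {
            box.center = transform.InverseTransformPoint(rend.bounds.center);
            Vector3 size = transform.InverseTransformVector(rend.bounds.size);
            box.size = new Vector3(Mathf.Abs(size.x), Mathf.Abs(size.y), Mathf.Abs(size.z));
        }
    }
}
```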
Special thanks to Colby for walking me through the prefabs and helping me realize I didn’t need to build everything from scratch. I also learned a lot by reviewing the class Wiki pages.
https://www.youtube.com/watch?v=paVX3Pm4Yq4
Combine with voice commands using Meta's Voice SDK.
Trigger more complex events like animations or scene changes.
Log button presses for research or classroom use (a sketch follows below).
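For the logging idea, here is a small sketch using only standard Unity and .NET APIs: wire the method to any button's UnityEvent and pass a string identifying the button. The file name and CSV format are illustrative choices.

```csharp
using System;
using System.IO;
using UnityEngine;

// Sketch: append timestamped button presses to a CSV file.
public class PressLogger : MonoBehaviour
{
    private string logPath;

    private void Awake()
    {
        // persistentDataPath is writable on the headset at runtime.
        logPath = Path.Combine(Application.persistentDataPath, "button_presses.csv");
    }

    // Wire this to a button's UnityEvent; the Inspector lets you
    // supply the string argument per button.
    public void LogPress(string buttonId)
    {
        File.AppendAllText(logPath, $"{DateTime.UtcNow:o},{buttonId}\n");
    }
}
```

Files written to Application.persistentDataPath can typically be pulled off the headset with adb for later analysis.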