Hand-tracking In-Class Activity

Introduction

The advent of immersive technology has brought about a significant change in the way we interact with digital content. With the Oculus Quest 2, users have the ability to navigate and manipulate virtual environments using both traditional controllers and advanced hand tracking technology.

The traditional controller-based interface uses handheld devices that the headset tracks in real-time, translating their position and orientation into the virtual environment. This interface is familiar to most users and allows for a high degree of precision, making it ideal for complex games and applications.
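As a peek under the hood (not code from the demo itself): on a WebXR page, each controller shows up as an input source whose grip pose can be read every frame. The sketch below is a minimal TypeScript example, assuming the WebXR Device API with typings from the @types/webxr package; the function and variable names are illustrative only.

```ts
// Minimal sketch: read each controller's position once per rendered frame.
// `frame` and `refSpace` come from the app's normal WebXR render loop.
function pollControllers(frame: XRFrame, refSpace: XRReferenceSpace): void {
  for (const source of frame.session.inputSources) {
    if (!source.gripSpace) continue; // not a tracked controller
    const pose = frame.getPose(source.gripSpace, refSpace);
    if (pose) {
      const { x, y, z } = pose.transform.position;
      console.log(`${source.handedness} controller at`, x, y, z);
    }
  }
}
```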

On the other hand, hand tracking technology represents a newer and more natural form of interaction. Instead of using physical controllers, the headset uses built-in cameras to track the position and movement of the user's hands and fingers. This not only allows for more intuitive interaction but also offers an enhanced sense of immersion, as users can see and use their hands directly within the VR space.
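When hand tracking is active instead, the same input-source list exposes a hand object with one tracked joint per knuckle and fingertip. Again, this is a generic WebXR Hand Input sketch rather than the demo's own code; "index-finger-tip" is one of the standard joint names defined by that module.

```ts
// Minimal sketch: read the index fingertip pose of each tracked hand.
function pollHands(frame: XRFrame, refSpace: XRReferenceSpace): void {
  for (const source of frame.session.inputSources) {
    if (!source.hand) continue; // a controller, not a hand
    const tipSpace = source.hand.get("index-finger-tip");
    if (!tipSpace) continue;
    const joint = frame.getJointPose(tipSpace, refSpace);
    if (joint) {
      // joint.radius approximates fingertip size, handy for pinch/touch checks
      console.log(`${source.handedness} index tip at`, joint.transform.position);
    }
  }
}
```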

In this in-class activity, we will try both input methods and compare them while interacting with objects, climbing a ladder, and simulating the physics of the world.

Instruction

Visit https://hand-tracking-scene.glitch.me/ in the Oculus Browser. Tap "VR" on the bottom right to start interacting with the scene!

The demo supports both controllers and hand tracking; make sure the headset is set to the input method you want before you open the website.
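If you are curious why the headset's current input mode matters: a WebXR page typically requests hand tracking as an optional session feature when you tap "VR", so whichever mode the headset is in at that moment is what the session uses. The snippet below is a hedged sketch of that pattern, not the demo's actual source.

```ts
// Minimal sketch: start an immersive session that accepts either input style.
// "hand-tracking" is optional, so the session still starts with controllers only.
async function enterVR(): Promise<XRSession | null> {
  if (!navigator.xr) return null; // WebXR not available in this browser
  const supported = await navigator.xr.isSessionSupported("immersive-vr");
  if (!supported) return null;
  return navigator.xr.requestSession("immersive-vr", {
    optionalFeatures: ["hand-tracking", "local-floor"],
  });
}
```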


Before reading the instructions below, I strongly recommend playing in the scene first with hand tracking, then comparing against the instructions to see what you didn't figure out; once you have found all the interactables in the scene, switch to controllers and go over it again.

* The multi-user room is not supported for now, since the scene does not load correctly in it.


More Demos on Hand Mesh and Physics

If you want to experiment more with hand-tracking mode, here are two simple demos on hand representation and physics interaction: