Overview

This project explores interaction in VR space in two ways: with controllers, adding finger-gesture support through a third-party joints simulation API, and without controllers, using the Quest 2's native hand tracking. Since controllers can make some actions less intuitive to trigger, is it better to interact without them? Or does more detailed, finger-level control make things easier?

The project will also look into multi-user support based on these two kinds of interaction.
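For the multi-user side, one option in the A-Frame ecosystem is the third-party networked-aframe library, which synchronizes entities between clients over a shared room. The sketch below is a minimal hello-world under assumptions: the script URLs, room name, and avatar template are placeholders, not decisions this project has made.

```html
<html>
  <head>
    <!-- A-Frame core; pin whichever release the project settles on -->
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
    <!-- networked-aframe build (placeholder path; see the NAF README for the current CDN URL) -->
    <script src="networked-aframe.min.js"></script>
  </head>
  <body>
    <!-- networked-scene joins a named room; the adapter choice (e.g. wseasyrtc) is an assumption -->
    <a-scene networked-scene="room: vr-activity; adapter: wseasyrtc">
      <a-assets>
        <!-- Template stamped out on every other client to represent this user -->
        <template id="avatar-template">
          <a-entity>
            <a-sphere radius="0.2" color="#5985ff"></a-sphere>
          </a-entity>
        </template>
      </a-assets>
      <!-- The local player: position/rotation are broadcast to peers in the room -->
      <a-entity id="player"
                camera position="0 1.6 0" look-controls wasd-controls
                networked="template: #avatar-template; attachTemplateToLocal: false">
      </a-entity>
    </a-scene>
  </body>
</html>
```

Comparing this against a hand-rolled solution on the raw WebXR/WebRTC APIs is exactly the A-Frame-vs-Web-API question listed under tools below.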

Tools to look into

A-Frame API for VR multi-user VS Web API for VR multi-user

Joints simulation API which supports finger gestures

Native hand tracking with the Quest 2
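For the controller-free path, A-Frame ships a built-in `hand-tracking-controls` component that renders tracked hands on devices that expose hand tracking (such as the Quest 2) and emits pinch events. A minimal sketch, assuming the headset has hand tracking enabled:

```html
<a-scene>
  <!-- One tracked-hand entity per hand; A-Frame draws a hand model automatically -->
  <a-entity id="leftHand" hand-tracking-controls="hand: left"></a-entity>
  <a-entity id="rightHand" hand-tracking-controls="hand: right"></a-entity>
</a-scene>

<script>
  // pinchstarted / pinchended are events documented for hand-tracking-controls;
  // the handler body here is an illustrative assumption, not project code.
  document.querySelector('#rightHand').addEventListener('pinchstarted', function (evt) {
    console.log('right-hand pinch began', evt.detail);
  });
</script>
```

The joints simulation API above is a separate, third-party tool for driving finger joints from controller input; its interface is not shown here because it is still being evaluated.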

Schedule

In-class activity

Participants split into two groups of pairs, with both groups performing the same activities in VR: passing a ball, lifting a log, and exploring a dataset. One group uses the Quest 2's native hand tracking; the other uses the joints simulation API with controllers.

Deliverables & Wiki contributions

- A page for the controller-based A-Frame joints simulation API

- A page for hand tracking with the Quest 2 headset

- An update to the A-Frame multi-user page

- Data format page for glTF files
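Since the shared activities involve loading 3D objects (the ball, the log, the dataset visuals), the glTF page will likely cover A-Frame's built-in `gltf-model` component. A minimal loading sketch, where the asset filename and position are placeholders:

```html
<a-scene>
  <a-assets>
    <!-- Preload the model; "ball.gltf" is a hypothetical asset path -->
    <a-asset-item id="ballModel" src="ball.gltf"></a-asset-item>
  </a-assets>
  <!-- Reference the preloaded asset by selector and place it in the scene -->
  <a-entity gltf-model="#ballModel" position="0 1 -2"></a-entity>
</a-scene>
```

glTF is a good fit here because it is the de facto transmission format for WebXR scenes and is the format A-Frame's documentation recommends for models.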