Project 1 In-Class Activity

The goal of this activity is to expose students to a diverse array of ways users can interact with robots in a virtual environment. By experimenting with various forms of feedback to and from the robotic system, students will reflect on their experience, evaluate the advantages and disadvantages of each form, and identify the most appropriate channel of collaboration for the given task and role of the robot.

Pre-Activity Set Up

  • Start your Paperspace machine

  • On Paperspace, download the zip file here

  • On your local computer, download SideQuest (if you haven't already)

  • On your local computer, download the apk file here

Haptic Control / Hand-guiding mode

Haptic control allows the user to move the robot by hand. This example simulates manual, touch-based interaction between the human and the robot in a virtual setting.

Here are the steps to set it up:

  1. Start your Paperspace machine

  2. On Paperspace, download the zip file here

  3. Open the zip file and extract all the files

  4. Navigate to Yumi_VR_BioIK -> yumi_VR_demo.exe

  5. Using your Quest, open Virtual Desktop and connect to your machine

  6. Open yumi_VR_demo.exe on your Quest. You should now see SteamVR and a Unity app open.

Movement / Controller Guide

  • you have to stand up and physically move around to navigate the space

  • look/turn around to find the robotic arms

  • reach for the blue and purple ends of the arms and grab them

  • once you are holding the colored ends, guide the robot to create some motion (try shaking hands)
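Conceptually, the hand-guiding behavior above amounts to the robot's end effector chasing the grabbed handle a little each frame. A minimal sketch of that idea (illustrative only; the demo itself uses BioIK inside Unity, and the function name and gain value here are assumptions):

```python
# Illustrative sketch of hand-guided motion: each frame, the end effector
# moves a fraction of the way toward the handle the user is holding.
# The name `follow_hand` and gain value are assumptions, not from the demo.

def follow_hand(effector_pos, hand_pos, gain=0.2):
    """Move the effector a fraction `gain` of the way toward the hand."""
    return [e + gain * (h - e) for e, h in zip(effector_pos, hand_pos)]

# Example: effector at the origin chasing a hand held at (1, 0, 0)
pos = [0.0, 0.0, 0.0]
for _ in range(20):
    pos = follow_hand(pos, [1.0, 0.0, 0.0])
print(pos[0])  # converges toward 1.0
```

Because the effector only closes a fraction of the gap each frame, the motion feels smooth and damped rather than snapping instantly to your hand.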

Keyboard/Controller-based Interface

To set up the keyboard and controller interactions:

  1. Download the apk file here

If you haven't enabled Developer Mode on your Quest, follow these instructions to do so:

  1. Open the Oculus app and tap Settings (bottom-right)

  2. Select your connected Quest from the device list and connect to it

  3. Tap More Settings which appears below your Quest in the device list

  4. Tap Developer Mode

  5. Tap the switch to enable developer mode

  6. Exit Settings on the app & reboot your Quest using the right-side power button

If you haven't downloaded SideQuest yet, you can install it here

To set up for sideloading using the Sidequest app:

  1. Open the Sidequest app on your computer

  2. Connect the Quest to your computer via USB cable.

  3. Put on the Quest headset – you should see a window open asking you to Allow USB debugging.

  4. Check the box labelled Always allow from this computer and click OK

  5. Click the icon showing an arrow inside a box at the top of the Sidequest window.

  6. Choose the apk you downloaded from the window that opens
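If the SideQuest upload button is hard to find, the same sideload can also be done from a terminal with adb, the Android debugging tool SideQuest uses under the hood. A small sketch that just assembles the command (the apk filename here is a placeholder, not the real download name, and adb must already be installed and on your PATH):

```python
# Build the adb sideload command as a string you can paste into a terminal.
# "demo.apk" is a placeholder -- substitute the apk you actually downloaded.
import shlex

apk_path = "demo.apk"
cmd = ["adb", "install", "-r", apk_path]  # -r replaces an existing install
print(shlex.join(cmd))  # run the printed command with the Quest plugged in
```

Running `adb devices` first is a quick way to confirm the headset is connected and USB debugging was allowed.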

To launch the sideloaded .apk file on your Quest:

  1. From your Quest headset, go to Library -> Unknown Sources

  2. Click the app you want to launch from the list on the right

Movement / Controller Guide

  • your left joystick controls your position

  • your right joystick controls your rotation

  • to select an input field, click on it and the keyboard will appear (there is no clear indicator that a field is selected, so type a number and confirm by trial and error that it landed in the field you intended)

  • to grab the ball, hold the side grip button (most likely under your middle finger) on the controller that is interacting with the object

  • the rays cast from the controllers change color when they land on an interactable object

  • once you're holding the ball, you can use the joysticks to move and extend the ball without changing your own location

  • to interact with UI elements such as the button or input fields, press the trigger button (most likely where your index finger is) on the controller that is interacting with the object

  • you can always click the reset button in case anything goes wrong
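The ray highlighting described above comes down to a ray-object intersection test each frame. A standalone sketch for a spherical target like the ball (geometry only; names are illustrative, not from the demo):

```python
# Illustrative ray-sphere test: does a controller ray land on a grabbable ball?

def ray_hits_sphere(origin, direction, center, radius):
    """Return True if the ray (origin, unit direction) intersects the sphere."""
    oc = [c - o for o, c in zip(origin, center)]
    b = sum(d * v for d, v in zip(direction, oc))  # projection of center onto ray
    if b < 0:
        return False                                # sphere is behind the ray
    dist_sq = sum(v * v for v in oc) - b * b        # squared perpendicular distance
    return dist_sq <= radius * radius

# Ray along +x from the origin; ball of radius 0.5 centered at (3, 0.3, 0)
print(ray_hits_sphere([0, 0, 0], [1, 0, 0], [3, 0.3, 0], 0.5))  # True
```

When the test succeeds, an engine would typically recolor the ray and mark the object as the current interaction target.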

Keyboard Interaction Guide

  • try inputting positions for the target object and adding them to the drop-down list so that they form a trajectory for the robotic arm to follow

  • check the option to show all points to visualize the positions of the target object
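The list of positions you build up acts as a waypoint trajectory. One simple way such a list could be turned into a path is linear interpolation between consecutive waypoints, sketched below (illustrative; the demo's internal representation may differ, and all names here are assumptions):

```python
# Illustrative waypoint trajectory: interpolate between stored positions.

def lerp(a, b, t):
    """Linear interpolation between points a and b at parameter t in [0, 1]."""
    return [x + t * (y - x) for x, y in zip(a, b)]

def sample_trajectory(waypoints, samples_per_segment=4):
    """Return evenly spaced points along the polyline of waypoints."""
    path = []
    for a, b in zip(waypoints, waypoints[1:]):
        for i in range(samples_per_segment):
            path.append(lerp(a, b, i / samples_per_segment))
    path.append(waypoints[-1])
    return path

# Three positions entered via the keyboard UI, forming an L-shaped path
waypoints = [[0, 0, 0], [1, 0, 0], [1, 1, 0]]
path = sample_trajectory(waypoints)
print(len(path))  # 9 points: 4 per segment plus the final waypoint
```

Visualizing all the sampled points (as the "show all points" option does for the waypoints themselves) makes it easy to see the shape of the motion before the arm executes it.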

Controller-based Interaction Guide

  • try holding the ball without the ray-cast extensions and moving very close to the robot (an arm's length away is ideal); hold your arms out and move the sphere as the robot tracks the target object

  • try using the rays to extend the ball and move it around the robot

  • add the positions of the target object to the drop-down list so that you can keep track of the trajectory of your movement