VR in Human-Robot Collaboration
by Jennifer Wang (2022)
As artificial intelligence (AI) and machine learning (ML) evolve, we expect robots to execute increasingly complex tasks autonomously: sensing the environment, responding to it, and acting within it. Robots are already common in industrial and commercial settings, handling complex assembly, picking and packing, last-mile delivery, and more. Human-robot collaboration (HRC) can offer many advantages for such complex processes, especially as flexibility and adaptability become key features of production systems in manufacturing.
Virtual reality (VR) systems have the potential to simulate collaborative processes in advance and to include workers and their individual behavior in the simulation. Typical industrial applications of VR in HRC range from manufacturing process simulation, which can provide real-time enhanced information for inspection or training, to collaborative factory planning during product design, where changes can be analyzed and evaluated prior to implementation.
Existing solutions can roughly be broken down into four categories:
Operator support concerns communication and guidance between the human and the robot.
The simulation category explores solutions that use simulation software to improve the user's understanding of the working environment.
The instruction category details how virtual and augmented environments help users teach the robot a hierarchy of tasks.
Finally, the teleoperation category examines solutions that enable the operator to control and manipulate the robot remotely.
An overarching goal in using VR for HRC is to provide straightforward, bidirectional communication between human and robot teammates. The human is provided information to more clearly understand the robot's intent and perception capabilities, while the robot is provided information about the human that enables it to build a model of its collaborator. A robot might want to communicate an overall plan, a goal location, or a general intent so that the human collaborator does not duplicate efforts, alter the environment, or put themselves in danger.
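To make this concrete, here is a minimal Python sketch of what such robot-to-human communication could look like: a hypothetical RobotIntent message that a planner publishes and the operator's VR display renders as an overlay. The type names and fields are illustrative assumptions, not taken from any of the cited systems.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Hypothetical message a robot planner could publish to the operator's VR display
# so the planned motion and goal are visible before execution begins.
@dataclass
class RobotIntent:
    goal_label: str                             # human-readable goal, e.g. "pick part A"
    goal_position: Tuple[float, float, float]   # goal location in the shared workspace frame (m)
    planned_waypoints: List[Tuple[float, float, float]] = field(default_factory=list)
    estimated_duration_s: float = 0.0           # rough time budget for the motion

def render_intent_in_vr(intent: RobotIntent) -> None:
    """Placeholder for the VR side: draw the goal marker and the planned path overlay."""
    print(f"Goal: {intent.goal_label} at {intent.goal_position}, "
          f"{len(intent.planned_waypoints)} waypoints, ~{intent.estimated_duration_s:.1f}s")

# Example: the robot announces a pick task so the human does not reach into the same area.
intent = RobotIntent(
    goal_label="pick part A",
    goal_position=(0.6, -0.2, 0.1),
    planned_waypoints=[(0.3, 0.0, 0.3), (0.5, -0.1, 0.2), (0.6, -0.2, 0.1)],
    estimated_duration_s=4.0,
)
render_intent_in_vr(intent)
```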
One such example is Bolano et al., where the authors used 3D cameras to monitor the robot and the GPU-Voxels library to build a point cloud from 3D models for predicting robot collisions. In this study, audio instructions informed the user of the robot's next move.
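The collision-prediction step can be illustrated with a simplified proximity check over point clouds. This is only a sketch: it uses a brute-force NumPy distance computation in place of the voxel structures GPU-Voxels provides, and the audio cue is simulated with a print statement.

```python
import numpy as np

def min_distance_to_robot(cloud_xyz: np.ndarray, robot_points_xyz: np.ndarray) -> float:
    """Smallest Euclidean distance between any scene point and any robot point.

    cloud_xyz:        (N, 3) points from the 3D cameras monitoring the workspace
    robot_points_xyz: (M, 3) points sampled from the robot's 3D model at its predicted pose
    """
    # Pairwise distances via broadcasting; fine for a sketch, but a voxel or KD-tree
    # structure (as in GPU-Voxels) would be needed for real-time operation.
    diffs = cloud_xyz[:, None, :] - robot_points_xyz[None, :, :]
    return float(np.sqrt((diffs ** 2).sum(axis=-1)).min())

def warn_if_close(cloud_xyz, robot_points_xyz, threshold_m=0.15):
    d = min_distance_to_robot(cloud_xyz, robot_points_xyz)
    if d < threshold_m:
        # Stand-in for the audio cue announcing the robot's next move.
        print(f"Audio cue: robot approaching, clearance {d:.2f} m")
    return d

# Toy data: a point near the robot's predicted sweep volume triggers the warning.
scene = np.array([[0.55, -0.15, 0.12], [1.2, 0.8, 0.9]])
robot = np.array([[0.6, -0.2, 0.1], [0.3, 0.0, 0.3]])
warn_if_close(scene, robot)
```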
The use of VR simulations makes it possible to reduce physical and mental barriers between human and robot. By allowing the user to immerse themselves in the simulation, VR systems create a realistic image of HRC processes, providing an inexpensive and safe way for manufacturers to conduct experiments and train their workers.
De Giorgio et al. demonstrated these benefits in a study where participants in a VR simulation could program the robot by teaching it pose targets and could walk around to explore the robot's movements. In direct interaction between human and robot, displaying the robot and its working environment in a dynamic 3D virtual environment helped the operator understand how to manage the correspondence between the different degrees of freedom when controlling a robotic arm with a human arm.
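A rough sketch of how pose-target teaching in VR might be structured is shown below. The Pose and PoseTeacher types are hypothetical stand-ins for whatever interface the simulation exposes, not the actual setup used by De Giorgio et al.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical pose type: position (m) plus orientation as a quaternion (x, y, z, w).
@dataclass
class Pose:
    position: Tuple[float, float, float]
    orientation: Tuple[float, float, float, float]

class PoseTeacher:
    """Collects pose targets from a tracked VR controller; the recorded list can
    later be sent to the robot controller as a motion program."""

    def __init__(self) -> None:
        self.targets: List[Pose] = []

    def record(self, controller_pose: Pose) -> None:
        # In a real system this would be triggered by a controller button press.
        self.targets.append(controller_pose)

    def to_program(self) -> List[Pose]:
        return list(self.targets)

teacher = PoseTeacher()
teacher.record(Pose((0.4, 0.0, 0.3), (0.0, 0.0, 0.0, 1.0)))
teacher.record(Pose((0.6, -0.2, 0.1), (0.0, 0.707, 0.0, 0.707)))
print(f"Taught {len(teacher.to_program())} pose targets")
```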
In developing an intuitive user interface, the use of VR in an HRC production scheme enables human operators to provide virtual instructions and high-quality demonstrations to teach robots a hierarchy of tasks.
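One way to picture such a hierarchy of tasks is as a tree whose leaves are primitives the robot already knows how to execute. The sketch below is an illustrative data structure under that assumption, not an interface from any of the cited works.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical task-tree node: a demonstrated task either decomposes into subtasks
# or bottoms out in a primitive the robot can execute directly.
@dataclass
class Task:
    name: str
    primitive: str = ""              # e.g. "move_to", "grasp", "release"; empty for composite tasks
    subtasks: List["Task"] = field(default_factory=list)

def flatten(task: Task) -> List[str]:
    """Depth-first expansion of the hierarchy into an executable primitive sequence."""
    if not task.subtasks:
        return [f"{task.primitive}({task.name})"]
    steps: List[str] = []
    for sub in task.subtasks:
        steps.extend(flatten(sub))
    return steps

# Example hierarchy a human operator might demonstrate in VR.
assemble = Task("assemble bracket", subtasks=[
    Task("fetch bracket", subtasks=[Task("bracket bin", "move_to"), Task("bracket", "grasp")]),
    Task("place bracket", subtasks=[Task("fixture", "move_to"), Task("bracket", "release")]),
])
print(flatten(assemble))
```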
To establish a bidirectional communication channel for HRC, Kousi et al. proposed an AR-based framework that helps the operator control the robot in case of unexpected errors, reprogram the production environment, exchange information between robot and operator tasks, and receive real-time execution status feedback. The AR application gave the human operator mechanisms to instruct the robot directly, easily, and quickly without any robotics expertise. It also made them acutely aware of the robots' limitations so they could provide feedback and demonstrations accordingly.
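The execution-status feedback described here could be modeled as a simple stream of status records pushed to the AR overlay. The sketch below assumes a hypothetical ExecutionStatus record and a placeholder rendering function; it is not Kousi et al.'s actual framework.

```python
from dataclasses import dataclass
import time

# Hypothetical status record streamed from the robot controller to the AR overlay,
# mirroring the real-time execution feedback described above.
@dataclass
class ExecutionStatus:
    task_name: str
    progress: float      # 0.0 .. 1.0
    state: str           # "running", "paused", "error"

def push_to_ar_overlay(status: ExecutionStatus) -> None:
    """Placeholder for updating the operator's AR view."""
    print(f"[{status.state}] {status.task_name}: {status.progress:.0%}")

# Simulate a task that hits an error, prompting the operator to re-instruct the robot.
for progress, state in [(0.3, "running"), (0.6, "running"), (0.6, "error")]:
    push_to_ar_overlay(ExecutionStatus("insert fastener", progress, state))
    time.sleep(0.1)
```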
References:
https://www.sciencedirect.com/science/article/pii/S2212827120314815#bib0026
https://dl.acm.org/doi/10.1145/2927929.2927948
https://www.sciencedirect.com/science/article/pii/S2351978918300234
https://ieeexplore.ieee.org/document/8172387
https://robotics.mit.edu/teleoperating-robots-virtual-reality
http://www.cairo-lab.com/papers/survey-ar-hrc.pdf
https://www.diva-portal.org/smash/get/diva2:1458194/FULLTEXT02
https://www.sciencedirect.com/science/article/pii/S1526612519303330