Leveraging Enhanced Virtual Reality Methods and Environments for Efficient, Intuitive, and Immersive Teleoperation of Robots

Over the last decade, many studies have focused on Virtual Reality (VR) frameworks for remotely controlling robotic systems. Although VR systems have been used to teleoperate robots in simple scenarios, their effectiveness in terms of accuracy, speed, and usability has not been rigorously evaluated for complex tasks that demand accurate trajectories. In this work, an Enhanced Virtual Reality (EVR) framework for robotic teleoperation is evaluated to assess whether it can be used efficiently in complex tasks that require accurate control of the robotic end-effector. The environment and the employed robot are captured using RGB-D cameras, while the remote user controls the motion of the robot with VR controllers. The captured data are transmitted and reconstructed in 3D, allowing the remote user to monitor the task execution progress in real time through a VR headset. The EVR system is compared with two other interface alternatives: i) teleoperation in pure VR (the robot model is rendered according to its real joint states), and ii) teleoperation in EVR with the virtual robot model superimposed on the real robot.
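The abstract does not detail how the RGB-D data are reconstructed in 3D. The minimal Python sketch below illustrates one common way a depth frame from a single RGB-D camera could be back-projected into a colored point cloud under a pinhole camera model; the function name, the intrinsic values (fx, fy, cx, cy), and the synthetic input data are illustrative assumptions, not part of the original work.

import numpy as np

def depth_to_point_cloud(depth, rgb, fx, fy, cx, cy, depth_scale=1000.0):
    # Back-project a depth image (with an aligned RGB image) into a colored
    # 3D point cloud using the pinhole model:
    #   Z = depth / depth_scale,  X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float32) / depth_scale      # depth in metres
    valid = z > 0                                   # drop pixels with no depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack((x[valid], y[valid], z[valid]), axis=-1)   # (N, 3)
    colors = rgb[valid].astype(np.float32) / 255.0               # (N, 3)
    return points, colors

# Usage with synthetic frames and hypothetical intrinsics:
depth = np.random.randint(500, 1500, size=(480, 640)).astype(np.uint16)   # mm
rgb = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
pts, cols = depth_to_point_cloud(depth, rgb, fx=615.0, fy=615.0, cx=320.0, cy=240.0)
print(pts.shape, cols.shape)

In the full system described above, such per-camera point clouds would additionally need to be transformed into a common frame and streamed to the VR headset; those steps are outside the scope of this sketch.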


Supplementary Resources

Document link: Frame Validation Procedure and Point Cloud Artifacts