Scalable, Fast, Highly-Accurate Human-to-Robot Skill Transfer with Augmented Reality and Wearable, Human-Machine Interfaces

Humans are capable of performing intricate and complex tasks, enabling seamless interaction with their surroundings. Capturing human-demonstrated skills and transferring them to robots is therefore beneficial for robots engaging with and executing tasks in a human-oriented world. However, such demonstrations are not always transferable, as kinematic differences can prevent robots from replicating the demonstrated strategies. In this work, we propose a human-to-robot skill transfer system in which the human demonstrator wears and directly controls the robot's end-effectors through appropriate interfaces. To ensure high-quality human demonstrations, an Augmented Reality (AR) scheme that visualizes the robot's dexterous workspaces is created so as to collect effective grasping and manipulation data. This AR interface also ensures that the collected demonstrations remain within the robot's capabilities. The hardware-agnostic nature and efficiency of the proposed interfaces and skill transfer methodology are demonstrated through the execution of complex tasks that require increased dexterity, such as microtome operation, writing, and drawing.
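The core idea of constraining demonstrations to the robot's capabilities can be illustrated with a minimal sketch. The snippet below flags demonstrated end-effector positions that fall outside a simplified spherical approximation of a robot's reachable workspace, as the AR overlay would highlight for the demonstrator. The function names, the sphere model, and all numeric values are illustrative assumptions, not the paper's actual method.

```python
import math

def in_workspace(point, center, reach):
    """Return True if a 3-D point lies within a spherical workspace
    of radius `reach` (meters) centered at `center` (assumed model)."""
    return math.dist(point, center) <= reach

def filter_demonstration(trajectory, center, reach):
    """Keep only the demonstrated samples the robot can actually reach."""
    return [p for p in trajectory if in_workspace(p, center, reach)]

if __name__ == "__main__":
    base = (0.0, 0.0, 0.5)                      # workspace sphere center (m)
    demo = [(0.2, 0.1, 0.6), (1.5, 0.0, 0.5)]   # demonstrated end-effector samples
    valid = filter_demonstration(demo, base, reach=0.8)
    print(valid)  # only the first sample is reachable
```

In practice the reachable set of a multi-DoF arm is far more complex than a sphere; a real system would query the manipulator's kinematic model (e.g., via inverse-kinematics feasibility checks), but the filtering principle is the same.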