An Electromyography-Based Shared Control Framework for Intuitive Robotic Telemanipulation

Over the years, robots have seen increasing use in a wide range of applications, including exploration, maintenance, and search and rescue operations in remote or hazardous environments. Developing fully autonomous systems for unstructured environments remains a challenging task, while controlling such systems manually requires trained operators. This paper proposes an electromyography (EMG) and fiducial-marker-based teleoperation framework that allows for shared task execution between the user and an autonomous control scheme. The framework offers an intuitive interface that allows the user to take control of the robot arm-hand system and perform tasks that cannot be executed autonomously. The EMG subsystem decodes specific gestures performed by the user, either triggering a switch between different control states of the robot or setting the desired grasp type for the robot hand. The robot arm is teleoperated through human motion captured by a vision system that employs fiducial markers. The system's performance is validated in a series of five experiments, in which the robotic system successfully executes fully autonomous tasks as well as tasks performed in a shared-control manner in synergy with the user.
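To make the gesture-triggered switching described above concrete, the sketch below shows a minimal version of such a control state machine in Python. It is purely illustrative: the gesture labels, the control states, and the handle_gesture interface are assumptions for the sake of the example, not the framework's actual implementation, and a real system would drive it from a live EMG gesture classifier rather than a scripted gesture stream.

```python
from enum import Enum, auto


class ControlState(Enum):
    """Hypothetical control states of the arm-hand system."""
    AUTONOMOUS = auto()     # robot executes the task on its own
    TELEOPERATION = auto()  # arm follows the user's tracked motion


# Hypothetical mapping from decoded EMG gestures to grasp types.
GRASP_GESTURES = {"pinch": "precision", "fist": "power"}


class SharedController:
    """Minimal gesture-driven state machine (illustrative only)."""

    def __init__(self) -> None:
        self.state = ControlState.AUTONOMOUS
        self.grasp_type = None

    def handle_gesture(self, gesture: str) -> None:
        # A dedicated gesture toggles who is in control ...
        if gesture == "wrist_flex":
            self.state = (ControlState.TELEOPERATION
                          if self.state is ControlState.AUTONOMOUS
                          else ControlState.AUTONOMOUS)
        # ... while grasp gestures only set the desired grasp type.
        elif gesture in GRASP_GESTURES:
            self.grasp_type = GRASP_GESTURES[gesture]


if __name__ == "__main__":
    ctrl = SharedController()
    # Scripted stand-in for the output of an EMG gesture decoder.
    for g in ["wrist_flex", "pinch", "wrist_flex"]:
        ctrl.handle_gesture(g)
        print(g, "->", ctrl.state.name, ctrl.grasp_type)
```

Separating the mode-switching gesture from the grasp-selection gestures, as above, keeps the two decisions independent: the user can preset a grasp type while the robot is still acting autonomously, then take over teleoperation with a single switch gesture.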