Electromyography and Potential Fields
Based Shared Control Framework for
Robotic Telemanipulation

Numerous robotics applications rely on telemanipulation frameworks to perform tasks in remote or dangerous environments. During the execution of telemanipulation tasks, human operators are responsible for controlling the robotic system effectively; such methods therefore require skilled, knowledgeable operators and complex interfaces. Semi-autonomous systems, on the other hand, blend user intentions with autonomous control modules, offering an alternative to complicated and non-intuitive interfaces. This paper proposes a shared control framework for robotic telemanipulation, based on electromyography (EMG) and potential fields, that can be used with a dexterous robot arm-hand system for the execution of complex (e.g., cube stacking) tasks. Electromyography shows promise for the development of muscle-machine interfaces (MuMIs); however, the signals are often noisy and hard to decode into motion. In this work, user motion is decoded from the myoelectric activations using Random Forests (RF) based regression models. To assist the user during task execution, potential fields are used to avoid obstacles and guide the end-effector towards desired objects, reducing the cognitive load on the user and the need for highly accurate motion predictions. The framework performance is experimentally validated in a real-time cube stacking task.
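The RF-based decoding step described above can be illustrated with a minimal sketch. This is not the paper's implementation: the feature dimensions, synthetic data, and the use of scikit-learn's RandomForestRegressor are assumptions made purely for illustration of mapping windowed EMG features to end-effector motion.

```python
# Hypothetical sketch: regressing end-effector velocities from EMG features
# with Random Forests. Channel count, window count, and the synthetic
# relationship between features and targets are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in data: 500 feature windows, 8 EMG channels -> 3D velocity.
X = rng.normal(size=(500, 8))                                # EMG features
y = 0.5 * X[:, :3] + rng.normal(scale=0.1, size=(500, 3))    # velocities

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

# Decode a motion command for one new EMG feature window.
window = rng.normal(size=(1, 8))
velocity = model.predict(window)   # shape (1, 3)
print(velocity.shape)
```

In practice, the feature extraction stage (e.g., mean absolute value or RMS per sliding window) matters as much as the regressor, since raw EMG is too noisy to feed to the model directly.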
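The potential-field assistance can likewise be sketched with the classical attractive/repulsive formulation. The gains, the influence distance, and the function name below are assumptions, not the paper's exact formulation.

```python
# Minimal sketch of a classical potential field: linear attraction to the
# goal plus a repulsive term active within distance d0 of each obstacle.
# Gains (k_att, k_rep) and threshold (d0) are illustrative assumptions.
import numpy as np

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5, d0=0.3):
    """Return a velocity command combining goal attraction and
    obstacle repulsion for a point end-effector at `pos`."""
    # Attractive component: pull linearly towards the goal.
    force = k_att * (goal - pos)
    # Repulsive component: push away from obstacles closer than d0.
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 0 < d < d0:
            force += k_rep * (1.0 / d - 1.0 / d0) * diff / d**3
    return force

pos = np.array([0.0, 0.0, 0.0])
goal = np.array([1.0, 0.0, 0.0])
obstacles = [np.array([0.2, 0.05, 0.0])]   # obstacle near the direct path
cmd = potential_field_step(pos, goal, obstacles)
print(cmd)
```

Because the repulsive term only activates near obstacles, the user's decoded motion dominates in free space, while the field takes over to deflect the end-effector when a collision becomes likely.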