On Semi-Autonomous, Intuitive, Lightmyography-Based Control of Humanlike Robotic and Prosthetic Hands Utilizing Video and IMU Data
Humanlike robotic hands, such as prosthetic hands, are becoming more advanced as technology develops, offering more lightweight, sophisticated solutions with multiple degrees of freedom. Alongside these hardware improvements, control systems and human-machine interfaces are also important areas of research, as they ensure that the operation of robotic hands is intuitive and easy to master. Amputees in particular are frequently disappointed by the difficulty of controlling their prostheses, which can lead to prosthesis rejection. One method that has been explored to reduce the effort and cognitive load on the user is to implement semi-autonomy via appropriate control schemes. In this paper, a semi-autonomous control framework is proposed that employs lightmyography-based decoding of grasping motions. The proposed framework makes use of video and IMU data to reduce the number of possible grasps (grasp affordances) based on the detected object and the hand orientation. The efficiency of the proposed framework has been experimentally validated against a manual control framework. Using the semi-autonomous framework, misclassifications decreased, leading to 17/20 successful reach-to-grasp motions executed, compared to 7/20 for the manual control case. The automatic thumb positioning functionality also made grasping more robust, allowing certain objects to be interacted with more dexterously.
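To illustrate the affordance-reduction idea described above, the following is a minimal conceptual sketch, not the authors' implementation: the object label from the video pipeline and an IMU-derived orientation value prune the grasp set before the lightmyography (LMG) decoder selects among the remaining candidates. The `AFFORDANCES` mapping, the `HandState` fields, the `palm_pitch` threshold, and all grasp names are hypothetical placeholders for illustration only.

```python
# Hypothetical sketch (not the paper's implementation): narrowing grasp
# candidates from object detection and IMU-derived hand orientation, then
# letting the LMG decoder choose among the reduced set.
from dataclasses import dataclass

# Assumed mapping from detected object class to its grasp affordances.
AFFORDANCES = {
    "mug":  ["cylindrical", "handle_hook"],
    "card": ["lateral_pinch"],
    "ball": ["spherical"],
    "pen":  ["tripod", "lateral_pinch"],
}

@dataclass
class HandState:
    object_label: str   # from the video-based object detector
    palm_pitch: float   # degrees, from the IMU (illustrative convention)

def candidate_grasps(state: HandState) -> list[str]:
    """Reduce the grasp set using object identity and hand orientation."""
    grasps = AFFORDANCES.get(state.object_label, ["power"])  # fallback grasp
    # Example orientation rule (assumed): a steep palm-down approach
    # rules out hooking the handle.
    if state.palm_pitch < -45.0:
        grasps = [g for g in grasps if g != "handle_hook"] or grasps
    return grasps

def select_grasp(state: HandState, lmg_scores: dict[str, float]) -> str:
    """Pick the highest-scoring LMG-decoded grasp among the candidates."""
    candidates = candidate_grasps(state)
    return max(candidates, key=lambda g: lmg_scores.get(g, 0.0))

# Usage: the decoder scores every grasp, but only candidates are considered,
# so a spurious high score for an implausible grasp cannot be selected.
scores = {"cylindrical": 0.3, "handle_hook": 0.6, "lateral_pinch": 0.9}
print(select_grasp(HandState("mug", palm_pitch=10.0), scores))  # handle_hook
```

The point of the sketch is the design choice the abstract highlights: because the context (object and orientation) shrinks the candidate set before decoding, a misclassification by the LMG decoder can only land on a grasp that is already plausible for the detected object, which is consistent with the reported reduction in misclassifications.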