Project Details

DEEP-HAND: deep sensing + deep learning for myocontrol of the upper limb

Subject Area Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Medical Physics, Biomedical Technology
Term from 2015 to 2023
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 272314643
 
This project proposal aims to deepen and extend our work and findings from Tact-Hand, in which we introduced tactile sensors and new machine learning methods to advance the state of the art in prosthetic hand control beyond surface electromyography (sEMG). Tactile sensors were used in a two-fold fashion: they provided high-dimensional input data on muscle deformations for human intent detection, and they sensorized the prosthetic hand itself, thus providing valuable input for autonomous grasp control. However, we also identified two fundamental hurdles: (a) tactile sensing, like sEMG, is limited to surface activity and should be augmented with sensors that detect deep muscle activity; and (b) the interaction between the human and the machine is as important as the machine learning methods, and needs to be further explored and strengthened. To this aim, we hereby propose to advance Tact-Hand by (i) developing novel sensors focused on deep muscle activity, (ii) extending the psychological investigation of human-machine interaction in myocontrol of arms and hands, and (iii) fitting this investigation with advanced machine learning methods, mainly based on deep learning. As in Tact-Hand, the technological progress will be continuously evaluated in real-life conditions with both amputees and able-bodied persons, using the experimental protocols and setups we developed in the previous project, as well as new ones.
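The intent-detection idea above (mapping high-dimensional sensor frames of muscle deformation to a desired grasp) can be illustrated with a minimal sketch. Everything here is invented for illustration: the taxel count, the grasp labels, the synthetic data, and the nearest-centroid classifier, which stands in for the far richer deep-learning models the project actually targets.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TAXELS = 64                        # assumed number of tactile sensing elements
GRASPS = ["rest", "power", "pinch"]  # hypothetical grasp intents

# Synthetic per-grasp activation templates, standing in for recorded
# muscle-deformation patterns.
templates = {g: rng.normal(size=N_TAXELS) for g in GRASPS}

def sample_frame(grasp, noise=0.3):
    """Simulate one noisy high-dimensional sensor frame for a given grasp."""
    return templates[grasp] + noise * rng.normal(size=N_TAXELS)

def classify(frame):
    """Nearest-centroid intent detection: return the grasp whose template
    is closest to the observed frame (a toy stand-in for a learned model)."""
    dists = {g: np.linalg.norm(frame - t) for g, t in templates.items()}
    return min(dists, key=dists.get)

# With moderate noise, each simulated frame maps back to the grasp
# that generated it.
predictions = [classify(sample_frame(g)) for g in GRASPS]
```

The same interface (a frame in, a grasp label or continuous activation out) is what a deep network trained on real tactile or deep-sensing data would expose; only the model in the middle changes.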
DFG Programme Research Grants