Movement Intention Detection for Non-Invasive and Intuitive Prosthetic Hand

Lay Summary 

Prosthetic hands have the potential to improve the lives of individuals who have lost their upper limbs. Yet their function remains limited because they are not connected to the human nervous system. The goal of this research program is to design and improve the interface between a prosthetic hand and its user by detecting the user's movement intention.
The long-term objective of the research program is to advance assistive devices, including human prostheses, through the inclusion of non-invasive and affordable technologies, and by improving the detection of users' movement intention for better control of the artificial body part. My short-term objectives (within the next 5 years) are to 1) develop a prosthetic arm sock that fuses force pressure sensors and electromyography (EMG) sensors to collect higher-quality muscle activity data, 2) examine the relationship between muscle and eye signals and movement intention, 3) improve grasp detection using multiple input modalities, including muscle activity sensing, inertial measurement units, and eye-tracking devices, and 4) develop a machine learning model to accurately predict a series of movement intentions during a sequential goal-directed task. Sensor fusion techniques will be applied throughout my research to exploit the complementary strengths of these input modalities.
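
As a rough illustration of the kind of multimodal fusion these objectives involve (not the program's actual pipeline), the sketch below concatenates hypothetical per-window EMG, inertial, and eye-tracking features and trains an off-the-shelf classifier to label grasp intention. The feature set, class labels, and classifier choice are all assumptions, and synthetic data stands in for real recordings.

```python
# Minimal sketch of feature-level sensor fusion for grasp-intention
# classification. All feature names, window counts, and class labels are
# illustrative assumptions; the random arrays stand in for real EMG, IMU,
# and eye-tracking recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_windows = 600                      # analysis windows across a session
n_emg, n_imu, n_gaze = 8, 6, 4       # assumed channels per modality

# Per-window features: EMG root-mean-square, IMU acceleration/rotation
# statistics, gaze fixation statistics (placeholders here).
emg_rms   = rng.random((n_windows, n_emg))
imu_feat  = rng.random((n_windows, n_imu))
gaze_feat = rng.random((n_windows, n_gaze))

# Feature-level fusion: concatenate all modalities into one vector per window.
X = np.hstack([emg_rms, imu_feat, gaze_feat])
y = rng.integers(0, 3, size=n_windows)   # assumed labels: 0=rest, 1=power grasp, 2=pinch

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Feature-level concatenation is only one possible fusion strategy; decision-level fusion, in which per-modality classifiers are combined, would be an equally plausible reading of the objectives above.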

This work will generate important tools for converting eye motion and muscle activity signals into movement intention, an innovative approach to movement intention detection for intuitive prosthesis control. Prosthetic devices have improved greatly in recent years, whereas human-machine interfaces for prosthesis control do not yet realize the devices' full potential. Bridging this gap would greatly advance prosthesis systems and improve users' quality of life. My research will benefit both the large number of people who have lost limbs and the prosthesis industry (through increased jobs and research funding). I expect that the knowledge generated from this work will enable us to design a smart prosthetic hand/arm and further improve our understanding of human-machine and human-robot interaction, remote control, and assistive device interaction.

Departments 
Computer Science
Communities 
St. John's
Locations 
Newfoundland and Labrador
Start date 
1 May 2020
End date 
30 Apr 2023