Sensorimotor Prosthetics

We work with stakeholders to create fit-for-purpose prostheses.

Acquiring a new skill, for example, learning to use chopsticks, requires accurate motor commands to be sent from the brain to the hand, and reliable sensory feedback from the hand to the brain. Over time and with training, the brain learns to handle this two-way communication flexibly and efficiently. Inspired by this sensorimotor interplay, our research is guided by a conviction that progress in prosthetic limb control is best achieved through a strong synergy of motor learning and sensory feedback.

We therefore study the interaction of neural and behavioural processes that control hand movements, with the ultimate aim of developing prosthetic control solutions that users find fit for purpose.

Video: Multi-Grip Classification-Based Prosthesis Control With Two EMG-IMU Sensors

Specifically, we are developing:

  • novel methods and technologies that harness the brain's flexibility in learning new skills for closed-loop prosthesis control;
  • efficient artificial intelligence algorithms for processing multi-modal data collected with hybrid sensors (a minimal illustrative sketch follows this list);
  • effective systems and stimulation paradigms to restore sensory feedback in prosthetic control;
  • a data-driven care model that enhances the experience of receiving a prosthesis.
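As a purely illustrative sketch of the classification-based approach named in the video caption above, the snippet below decodes a grip class from windowed data using simple per-channel features (mean absolute value and waveform length) and a linear discriminant classifier. All specifics here, including the channel layout, window length, feature set, synthetic signals, and the scikit-learn LDA classifier, are assumptions made for illustration and do not describe our actual pipeline.

```python
# Hypothetical sketch: classification-based multi-grip control from two
# EMG-IMU sensor units. Synthetic signals stand in for recorded data;
# layout, window length, features, and classifier are illustrative choices.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(seed=0)

WIN = 200          # assumed analysis window length in samples (~200 ms at 1 kHz)
N_CH = 8           # assumed: 2 sensors x (1 EMG channel + 3 IMU axes)
GRIPS = ["rest", "power", "pinch", "tripod"]

def features(window):
    """Per-channel features for one window: mean absolute value
    (common for EMG) and waveform length."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

def synthetic_trial(grip_idx):
    """Stand-in for a recorded training window: noise whose per-channel
    scale depends on the grip, so the classes are separable."""
    scales = 1.0 + 0.5 * grip_idx + 0.2 * np.arange(N_CH)
    return rng.normal(0.0, scales, size=(WIN, N_CH))

# Small synthetic training set: 30 windows per grip class.
X = np.array([features(synthetic_trial(g))
              for g in range(len(GRIPS)) for _ in range(30)])
y = np.repeat(np.arange(len(GRIPS)), 30)

clf = LinearDiscriminantAnalysis().fit(X, y)

# "Online" use: classify each incoming window and map it to a grip command.
new_window = synthetic_trial(2)   # pretend this window just arrived from the sensors
grip = GRIPS[clf.predict(features(new_window)[None, :])[0]]
print(f"Decoded grip command: {grip}")
```

In practice the decoded label would be translated into a command for the prosthetic hand; the value of the closed loop comes from pairing such decoders with sensory feedback and user training, as outlined above.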

Our work on sensorimotor prosthetics builds on a long-term collaboration between Dr Nazarpour, Prof Vijayakumar and Dr Roche.

Active Topics

Key Collaborators

Matthew Dyson, Newcastle University
Sarah Day, University of Strathclyde