
School of Health Sciences

Examining the spatiotemporal disruption to gaze when using a myoelectric prosthetic hand

Researchers: Mr Johnny Parr (PhD), Gavin Buckingham, Dr Greg Wood, Sam Vine, Pan Dimitriou and Sarah Day

Prosthetic hand devices are often poorly utilised and frequently rejected. These high rejection rates have been attributed to the cognitive burden the devices impose on the user. Here, we investigated the nature of this burden by simultaneously examining gaze behaviour and EEG coherence between the verbal-analytical (T7) and motor planning (Fz) regions in able-bodied participants using a prosthetic hand simulator.
 
Twenty participants performed 30 trials of the “lifting a heavy object” task from the Southampton Hand Assessment Procedure (SHAP) using their anatomical hand and a myoelectric prosthesis simulator. During performance, gaze behaviour was recorded to determine the spatial (target-locking strategies) and temporal (gaze-shifting) characteristics of visual attention. EEG was also recorded to compute T7-Fz coherence in the high-alpha (10-12 Hz) bandwidth, indexing the extent of conscious movement control during the reaching and grasping phases of the task. Participants were significantly slower, used more hand-focused gaze, and were significantly delayed in disengaging vision from hand movements when using the prosthetic simulator. These disruptions were amplified during manipulation of the jar.
 
The dependence on vision during the manipulation phase coincided with increased T7-Fz coherence, suggesting conscious movement control during this phase. These findings suggest that the combination of increased visual attention and verbal-analytical processing contributes to the cognitive burden associated with prosthetic hand rejection. These metrics can now be used to test the efficacy of rehabilitation strategies and may inform hand prosthesis design.

Contact Information

For more information on this project, please contact Johnny at parrj@hope.ac.uk.