Events Calendar

PhD Final Exam – Henrique Cunha Dantas

Data Driven Approaches For Decoding Volitional Movement Intent From Bioelectrical Signals

There are nearly two million people with limb amputations living in the United States. Loss of a limb results in profound changes in one's life. However, the underlying neural circuitry, and much of the ability to sense and control movements of the missing limb, is retained even after limb loss. This means that amputees retain the ability to control artificial limbs in a manner similar to how they controlled the limb before its loss. The goal of this research is to develop technologies for creating prosthetic arms that behave like the natural limb. Movement intent decoders allow amputees to control prostheses by interpreting motor-related bioelectrical signals, restoring their ability to perform day-to-day tasks. Such systems must overcome a number of challenges before they can become practical, including the recursive nature of the human decision-making process, the limited amount of data typically available for training, and the time-varying properties of the nervous system. In this dissertation, we apply data-driven techniques to develop precise movement intent decoders and prosthetic controllers. Specifically, this work makes three major contributions to the field:

1. We developed movement intent decoders based on different neural network architectures, including multilayer perceptron networks, convolutional neural networks, and long short-term memory networks. These systems were trained with a dataset aggregation (DAgger) approach, an imitation learning algorithm. DAgger augments the training set based on the decoder outputs during the training stage, mitigating possible mistakes that the decoders could make. The decoders were validated in offline analyses using data from two subjects with arm amputations. The results demonstrated an improvement of up to 60% in the normalized mean-squared decoding error over state-of-the-art decoders.
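The DAgger idea above can be illustrated with a toy closed-loop example. The scalar state, the linear decoder, and the expert below are hypothetical stand-ins for illustration only, not the neural network decoders or recorded bioelectrical signals used in the dissertation:

```python
# Minimal sketch of DAgger-style data aggregation for decoder training.
# The scalar "state", linear decoder, and expert are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)

def expert_action(s):
    # Hypothetical ground-truth intent: drive the state toward zero.
    return -0.5 * s

def fit(states, actions):
    # Least-squares fit of a scalar linear decoder: action ~ w * state.
    s, a = np.asarray(states), np.asarray(actions)
    return float(s @ a / (s @ s))

def roll_out(w, s0=1.0, steps=20):
    # Run the decoder in closed loop and record the states it visits;
    # these are the states DAgger will ask the expert to label.
    states, s = [], s0
    for _ in range(steps):
        states.append(s)
        s = s + w * s + 0.01 * rng.standard_normal()
    return states

def dagger(n_iters=5):
    # Seed the dataset with an expert rollout, then repeatedly
    # aggregate expert labels for the states the trained decoder
    # actually reaches, mitigating the decoder's own mistakes.
    states = roll_out(-0.5)
    actions = [expert_action(s) for s in states]
    w = fit(states, actions)
    for _ in range(n_iters):
        visited = roll_out(w)
        states += visited
        actions += [expert_action(s) for s in visited]
        w = fit(states, actions)
    return w
```

Here `dagger()` recovers the expert's gain; the dissertation applies the same aggregation principle to MLP, CNN, and LSTM decoders.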
2. Movement intent decoders can be of different types, including proportional controllers, classification-based decoders, and goal-based estimators. Each type of decoder comes with its own set of advantages and weaknesses. We developed a shared-controller framework that combines multiple decoders to control a prosthetic limb, taking advantage of the individual strengths of the component decoders. The framework was validated using two shared-controller systems. The first combined a Kalman filter (KF)-based decoder and a classifier-based decoder. The second consisted of a KF-based decoder and a controller with knowledge of the final goal, subject to a substantial amount of uncertainty. The controllers were validated using three amputee subjects and three intact-arm subjects. The shared-controller systems outperformed the component decoders on most of the metrics used. For example, subjects were able to stay in the intended position 70% longer using the KF-based decoder combined with a classifier-based decoder than with the KF-based decoder alone, and 283% longer than with the classifier-based decoder alone.

3. Although the human body is a time-varying system, the decoder parameters are kept unchanged after training in many prosthesis systems, causing decoder performance to deteriorate over time. We developed an online-learning algorithm that allows a decoder to adapt itself during the post-training phase. The performance of such decoders was validated using data from two subjects with transradial amputations. Five months after the initial training, the decoder with adaptation exhibited a 27% lower normalized mean-squared decoding error than the same decoder without adaptation.
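One simple way to picture post-training adaptation of this kind is a least-mean-squares (LMS) style update that nudges decoder weights toward a reference movement as new samples arrive. This is a hedged sketch under the assumption of a linear decoder and a known reference signal; it is not the dissertation's exact algorithm:

```python
# Sketch of an online, LMS-style adaptation rule for a linear movement
# intent decoder; the linear model and step size mu are assumptions.
import numpy as np

def lms_adapt(w, x, d, mu=0.05):
    """One post-training adaptation step.

    w  : current decoder weight vector
    x  : feature vector derived from the bioelectrical signal
    d  : reference movement (e.g., from a guided retraining cue)
    mu : step size controlling how fast the decoder tracks drift
    """
    y = w @ x              # decoder output
    e = d - y              # decoding error
    return w + mu * e * x  # gradient step that reduces the error
```

Repeating this update as signal properties drift lets the decoder track the time-varying system instead of freezing its parameters at training time.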
In summary, the contributions of this research are training algorithms that create more accurate volitional movement intent decoders than previously possible, shared prosthesis controllers that combine multiple decoders in ways that outperform the component decoders, and an online-learning algorithm that enables decoders to perform significantly better in the long term than current decoder realizations. Together, these contributions bring us closer to the goal of creating limb prostheses that work and feel like the real limb.

Major Advisor: V John Mathews
Committee: Alan Fern
Committee: Fuxin Li
Committee: Xiao Fu
GCR: Julie Tucker

Tuesday, May 28 at 9:00am to 11:00am

Kelley Engineering Center, 1007
110 SW Park Terrace, Corvallis, OR 97331

Event Type

Lecture or Presentation

Event Topic

Electrical Engineering and Computer Science

Contact Name

Calvin Hughes

Contact Email
