Wed-SS-2-7-5 Neural Speech Decoding for Amyotrophic Lateral Sclerosis

Debadatta Dash (The University of Texas at Austin), Paul Ferrari (The University of Texas at Austin), Angel Hernandez (Division of Neurosciences, Helen DeVos Children's Hospital), Daragh Heitzman (MDA/ALS Center, Texas Neurology), Sara Austin (The University of Texas at Austin) and Jun Wang (The University of Texas at Austin)
Abstract: Amyotrophic lateral sclerosis (ALS) is a motor neuron disease that may cause locked-in syndrome (completely paralyzed but aware). Locked-in patients can communicate with brain-computer interfaces (BCIs), e.g., EEG spellers, but these have a low communication rate. Recent research has progressed towards neural speech decoding paradigms that have the potential for normal communication rates. Yet, current neural decoding research is limited to typical speakers, and the extent to which these studies can be translated to a target population (e.g., ALS) is still unexplored. Here, we investigated the decoding of imagined and spoken phrases from non-invasive magnetoencephalography (MEG) signals of ALS subjects using several spectral features (band-power of brainwaves: delta, theta, alpha, beta, and gamma frequencies) with seven machine learning decoders (naive Bayes, K-nearest neighbor, decision tree, ensemble, support vector machine, linear discriminant analysis, and artificial neural network). Experimental results indicated that the decoding performance for ALS patients is lower than that of healthy subjects yet significantly higher than chance level. The best performances were 75% for decoding five imagined phrases and 88% for five spoken phrases from ALS patients. To our knowledge, this is the first demonstration of neural speech decoding for a speech-disordered population.
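The band-power features mentioned in the abstract can be illustrated with a minimal sketch. The band edges below are the usual textbook delta/theta/alpha/beta/gamma ranges, not values taken from the paper, and the toy signal and sampling rate are invented for illustration; the paper's actual MEG pipeline and its seven decoders are not reproduced here.

```python
import math
import random

# Canonical neural frequency bands in Hz (textbook ranges; the paper may
# use different edges).
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 70)}

def band_powers(signal, fs):
    """Mean DFT power of `signal` within each canonical band.

    A naive O(n^2) DFT keeps the example dependency-free; a real
    pipeline would use an FFT-based PSD estimate.
    """
    n = len(signal)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        bins = [k for k in range(n // 2 + 1) if lo <= k * fs / n < hi]
        total = 0.0
        for k in bins:
            re = sum(x * math.cos(2 * math.pi * k * i / n)
                     for i, x in enumerate(signal))
            im = sum(-x * math.sin(2 * math.pi * k * i / n)
                     for i, x in enumerate(signal))
            total += (re * re + im * im) / n
        powers[name] = total / len(bins)
    return powers

# Toy trial: 2 s of a 10 Hz oscillation plus noise, sampled at 250 Hz.
random.seed(0)
fs, n = 250, 500
alpha_trial = [math.sin(2 * math.pi * 10 * i / fs) + 0.05 * random.gauss(0, 1)
               for i in range(n)]
pa = band_powers(alpha_trial, fs)
print(max(pa, key=pa.get))  # 10 Hz falls in the alpha band -> "alpha"
```

In a decoding pipeline, the five band powers per sensor would form the feature vector passed to each classifier.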