Multimodal explainable AI predicts upcoming speech behavior in adults who stutter

HIGHLIGHTS

  • who: Edward Golob from the University of Eastern Finland, Finland, has published the article: Multimodal explainable AI predicts upcoming speech behavior in adults who stutter, in the Journal: (JOURNAL)
  • what: The aim of this study was to detect EEG and facial muscle activity signals that precede vocalizations and can jointly predict fluent vs. stuttered speech outcomes. To evaluate the method, the authors examine the dynamics of differing EEG states during speech preparation and the facial muscle movements of adults who stutter (AWS). The authors show that the proposed algorithm can learn to predict upcoming fluent vs. stuttered speech from . . .
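
The excerpt above does not describe the authors' actual pipeline, so the following is only a hypothetical, minimal sketch of the kind of multimodal prediction task the highlights describe: pre-speech EEG and facial EMG epochs are reduced to simple features (assumed here to be EEG band power and EMG root-mean-square amplitude) and combined in a binary classifier of fluent vs. stuttered outcomes. All data, feature choices, and the logistic-regression model are illustrative assumptions, not the published method.

```python
# Hypothetical sketch only: joint EEG + facial-EMG classification of
# upcoming fluent (0) vs. stuttered (1) speech. Synthetic data stands in
# for real recordings; features and model are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 256                                  # assumed sampling rate (Hz)
n_trials, n_eeg_ch, n_emg_ch = 120, 32, 4
eeg = rng.standard_normal((n_trials, n_eeg_ch, fs))   # 1 s pre-speech EEG
emg = rng.standard_normal((n_trials, n_emg_ch, fs))   # 1 s facial EMG
y = rng.integers(0, 2, n_trials)                      # fluent vs. stuttered

def band_power(x, fs, lo, hi):
    """Mean spectral power of each channel within a frequency band."""
    freqs = np.fft.rfftfreq(x.shape[-1], 1 / fs)
    psd = np.abs(np.fft.rfft(x, axis=-1)) ** 2
    return psd[..., (freqs >= lo) & (freqs < hi)].mean(axis=-1)

# EEG features: alpha (8-13 Hz) and beta (13-30 Hz) power per channel.
eeg_feats = np.concatenate(
    [band_power(eeg, fs, 8, 13), band_power(eeg, fs, 13, 30)], axis=1)
# EMG features: root-mean-square amplitude per facial muscle channel.
emg_feats = np.sqrt((emg ** 2).mean(axis=-1))

# Concatenate modalities into one feature vector per trial and cross-validate.
X = np.concatenate([eeg_feats, emg_feats], axis=1)
clf = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

On real data the accuracy would reflect whether pre-speech neural and muscular activity carries predictive information; on the synthetic noise used here it hovers near chance, which is the expected baseline.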

     
