Neural Correlates of Unimodal and Multimodal Speech Perception in Cochlear Implant Users and Normal-hearing Listeners
Author: Hannah E. Shatzer
Published: 2020
Spoken word recognition often involves the integration of both auditory and visual speech cues. The addition of visual cues is particularly useful for individuals with hearing loss and cochlear implants (CIs), as the auditory signal they perceive is degraded compared to that of individuals with normal hearing (NH). CI users generally benefit more from visual cues than NH perceivers; however, the underlying neural mechanisms affording them this benefit are not well understood. The current study sought to identify the neural mechanisms active during auditory-only and audiovisual (AV) speech processing in CI users and to determine how they differ from those of NH perceivers. Postlingually deaf, experienced CI users and age-matched NH adults completed syllable and word recognition tasks during EEG recording, and the neural data were analyzed for differences in event-related potentials and neural oscillations. The results showed that during phonemic processing in the syllable task, CI users exhibited stronger AV integration, shifting processing away from primary auditory cortex and weighting the visual signal more strongly. During whole-word processing in the word task, early acoustic processing in CI users was preserved and similar to that of NH perceivers, but again showed robust AV integration. Lipreading ability also predicted suppression of early auditory processing across both CI and NH participants, suggesting that while some neural reorganization may have occurred in CI recipients to improve multisensory integrative processing, visual speech ability leads to reduced sensory processing in primary auditory cortex regardless of hearing status. These findings further support behavioral evidence for strong AV integration in CI users and the critical role of vision in improving speech perception.