Audiovisual speech perception in cochlear implant recipients
Final Report Abstract
The starting point of this project was the fact that verbal communication in everyday life is based on multimodal information. Particularly for cochlear implant (CI) users, visual speech information is important in addition to the auditory signal. To date, the mechanisms of multisensory speech perception have been insufficiently understood and have received little consideration in everyday clinical practice. The project made a significant contribution to this topic by developing extensive material for audiovisual stimulation based on virtual reality (VR) techniques. On this basis, various aspects of audiovisual speech perception with CIs could be investigated, including speech recognition in acoustically challenging situations, mechanisms of cortical reorganisation, and the perception of prosodic features. Numerous VR-based stimuli were generated that provide speech material for syllable, single-word, matrix-sentence and everyday-sentence test procedures. In addition, virtual characters were created that include facial and head movements as well as articulatory movements of the mouth, in order to analyse the identification of accented words in sentences.

With regard to speech recognition in acoustically challenging situations, a significant benefit of these materials compared with auditory-only speech presentation was demonstrated. In bimodal CI users (CI plus contralateral hearing aid), this gain was independent of whether the speech signal was delivered electrically or acoustically, indicating that the visual features provided complementary information for both types of transmission. With regard to cortical reorganisation, the EEG studies showed additional activation in the visual cortex and delayed activation in the auditory cortex for CI users compared with normal-hearing listeners. Comparing the time course of cortical audiovisual speech processing between CI patients with bilateral and those with unilateral hearing loss revealed a stronger visual influence on auditory speech processing in the former group, with both groups showing additional recruitment of the visual cortex and improved lip-reading ability relative to normal-hearing listeners. Regarding prosody perception, CI recipients were better able than normal-hearing listeners to use visual features such as eyebrow and head movements to identify accented words within a sentence. However, this advantage could not necessarily be explained by an integration of the two modalities in the population analysed; rather, there were subgroups of CI users who relied primarily on either auditory or visual features to identify the accented words.

In summary, the project has provided significant further insights into audiovisual speech processing in CI patients, relating to different facets of speech perception. Moreover, with the stimuli and methods established in the project, it provides an important foundation for future studies on multimodal speech processing.
Publications
- Layer N., Weglage A., Müller V., Meister H., Lang-Roth R., Walger M., Murray M.M. & Sandmann P. Differences in audiovisual speech processing in CI users with unilateral and bilateral hearing loss: an ERP study. International Congress of Cognitive Neuroscience (ICON), June 2020, Helsinki, Finland.
- Ihly P., Meister H., Hollfelder D., Bruchhage K. & Jürgens T. Bimodal and audiovisual benefits of speech intelligibility in cochlear implant listeners with contralateral acoustic hearing. Conference on Implantable Auditory Prostheses (CIAP), July 2021, online.
- Layer N., Weglage A., Müller V., Walger M., Lang-Roth R., Meister H. & Sandmann P. Electrophysiological correlates of audiovisual speech perception in CI patients. XXVII IERASG Symposium, June 2021, Cologne, Germany.
- Abdellatif K.H., Winter I.S., Wächtler M., Sandmann P. & Meister H. A method for examining audiovisual prosody perception based on a virtual human. Conference on Computational Audiology, 30 June - 1 July 2022.
- Meister H., Winter I.S., Waechtler M., Sandmann P. & Abdellatif K. A virtual reality-based method for examining audiovisual prosody perception.
- Layer N., Weglage A., Müller V., Meister H., Lang-Roth R., Walger M., Murray M.M. & Sandmann P. Electrophysiological differences and similarities in audiovisual speech processing in CI users with unilateral and bilateral hearing loss. Current Research in Neurobiology, 3, 100059.
- Layer N., Weglage A., Müller V., Meister H., Lang-Roth R., Walger M., Murray M.M. & Sandmann P. The timecourse of multisensory speech processing in unilaterally stimulated cochlear implant users revealed by ERPs. NeuroImage: Clinical, 34, 102982.
- Meister H. et al. Examination of audiovisual prosody in cochlear implant recipients. 184th Meeting of the Acoustical Society of America, 8-12 May 2023, Chicago, IL, USA.
