Project Details

Cortical and behavioural measures of active communication

Subject Areas: Acoustics; Biological Psychiatry; Medical Physics, Biomedical Technology
Term: since 2020
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 444761144
 
Our understanding of auditory speech processing and attention stems mainly from controlled laboratory experiments using simplified, artificial stimuli that differ in complexity and predictability from real-life scenarios. While the influence of visual cues such as lip-reading on auditory speech processing is well established, audio-visual processing in complex communication scenarios such as social interactions has received little attention. Real-life communication involves interactive loops with visual cues such as body movements, gestures, facial expressions, and eye blinks, which are not well captured in traditional lab-based experiments that treat the listener as a passive receiver.

To advance the field, we develop interactive virtual environments (VEs), which make it possible to implement reproducible yet realistic audio-visual scenarios. Our VEs combine the goals of replicability and reproducibility with the complexity of everyday communication demands. Our aim is to advance the understanding of audio-visual speech processing by developing and evaluating VEs that capture at least some of the complexity of natural communication situations and include the listener as an active agent.

On the neural level, the temporal dynamics of auditory attention to speech can be explored non-invasively by means of electroencephalography (EEG). In the first funding period of this project, cortical speech tracking with EEG was implemented in non-interactive VEs and allowed us to objectively evaluate attentive listening. Specifically, we found that cortical speech tracking produced valid results for unscripted but pre-recorded stories presented in a VE. In the second funding period, we will advance our VEs to simulate real-presence and telepresence conditions, enabling the study of live, interactive communication scenarios.
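As a minimal illustration of the cortical speech tracking mentioned above (not the project's actual analysis pipeline), such tracking is commonly quantified with a temporal response function (TRF): a ridge regression that maps time-lagged copies of the speech envelope onto the EEG signal, with the stimulus-to-EEG prediction correlation serving as the tracking score. All names and parameters below are illustrative assumptions.

```python
import numpy as np

def lagged_matrix(stimulus, n_lags):
    """Design matrix of time-lagged copies of the stimulus envelope."""
    n = len(stimulus)
    X = np.zeros((n, n_lags))
    for lag in range(n_lags):
        X[lag:, lag] = stimulus[:n - lag]
    return X

def estimate_trf(stimulus, eeg, n_lags=32, ridge=1.0):
    """Ridge-regression TRF: minimise ||X w - eeg||^2 + ridge * ||w||^2."""
    X = lagged_matrix(stimulus, n_lags)
    XtX = X.T @ X + ridge * np.eye(n_lags)
    return np.linalg.solve(XtX, X.T @ eeg)

def tracking_score(stimulus, eeg, n_lags=32, ridge=1.0):
    """Correlation between the EEG and its envelope-based prediction."""
    w = estimate_trf(stimulus, eeg, n_lags, ridge)
    pred = lagged_matrix(stimulus, n_lags) @ w
    return float(np.corrcoef(pred, eeg)[0, 1])

# Toy demo: "EEG" that is a delayed, noisy copy of the speech envelope,
# so the tracking score should be clearly above chance.
rng = np.random.default_rng(0)
env = rng.standard_normal(2000)                           # toy speech envelope
eeg = np.roll(env, 8) + 0.5 * rng.standard_normal(2000)   # 8-sample neural delay
print(tracking_score(env, eeg))
```

In practice, TRFs are estimated with cross-validation on multi-channel EEG (e.g. with toolboxes such as mTRF or MNE-Python); this sketch only shows the core regression idea.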
In Study 1, we will compare passive listening with active conversation in a dyadic communication scenario, manipulating conversational involvement (active vs. passive) and the visual representation of interlocutors (real presence vs. telepresence). In Study 2, we will explore auditory processing and interpersonal synchrony between familiar and unfamiliar interlocutors in triadic turn-taking communication scenarios, introducing interlocutor familiarity as a factor and investigating the influence of visual interlocutor representation (real presence vs. telepresence). Study 3 will investigate auditory processing and interpersonal synchrony in triadic turn-taking situations in the presence of audio-visual distractor scenarios, thereby manipulating communication difficulty. In all three studies, we will use EEG as a non-invasive means of monitoring attention to speech and non-speech signals. In addition, measures of behavioural synchrony between conversing individuals will be developed to predict communication effort.
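One common way to operationalise behavioural synchrony between conversing individuals, sketched here purely as a hypothetical example (the project's actual measures are still to be developed), is windowed correlation between two interlocutors' movement or speech-feature time series. All names and parameters are invented for illustration.

```python
import numpy as np

def windowed_synchrony(sig_a, sig_b, win=100, step=50):
    """Mean absolute Pearson correlation over sliding windows.

    Higher values indicate stronger moment-to-moment coordination
    between the two signals (e.g. head-movement traces).
    """
    scores = []
    for start in range(0, len(sig_a) - win + 1, step):
        a = sig_a[start:start + win]
        b = sig_b[start:start + win]
        if a.std() > 0 and b.std() > 0:
            scores.append(abs(np.corrcoef(a, b)[0, 1]))
    return float(np.mean(scores))

# Toy demo: a coordinated partner mirrors the lead signal, an
# uncoordinated one is independent noise.
rng = np.random.default_rng(1)
lead = rng.standard_normal(1000)
follower = 0.8 * lead + 0.2 * rng.standard_normal(1000)  # coordinated
stranger = rng.standard_normal(1000)                     # uncoordinated
print(windowed_synchrony(lead, follower), windowed_synchrony(lead, stranger))
```

Such a score could then serve as one predictor of communication effort, alongside the EEG-based attention measures.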
DFG Programme: Priority Programmes
Co-Investigator: Dr. Giso Grimm
 
 
