Project Details
ContextualEYEze: The interplay of gaze and context information in person perception, social understanding, and decision-making
Applicant
Professor Dr. Anne Böckler-Raettig
Subject Area
General, Cognitive and Mathematical Psychology
Term
since 2024
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 550926152
Over the last decades, researchers have become increasingly interested in the attentional, affective, and cognitive processes that enable human coordination and cooperation. A promising avenue in pursuing this endeavor is to investigate basic mechanisms of social cognition in increasingly complex social settings. One such basic facet is the remarkable human sensitivity to conspecifics’ eyes. Because the context-dependency of direct gaze effects is typically ignored in controlled lab experiments, the functional significance of direct gaze in realistically complex social encounters remains unclear and controversial. The current proposal addresses the integration of gaze and emotional context information in lifelike and dynamic, yet experimentally controlled paradigms. The importance of this question for the field of social cognition has been repeatedly emphasized. Work package 1 expands my recently developed Zoom task, in which participants (acting as “third party”) observe dyadic online conversations between listeners (visible) and narrators (audible). This setup allows the systematic and comprehensive manipulation of the listener’s gaze behavior (e.g., gaze direction and gaze shifts) and of the emotional context (in the form of sad, joyful, embarrassing, or neutral narrations). Initial publications and pilot data demonstrate the potential of this approach and reveal that gaze behavior shapes person perception and prosocial decisions towards conversation partners (e.g., generosity) in a context-dependent manner. Work package 2 complements these insights by expanding the EmpaToM, a task I created and thoroughly validated to assess affective and cognitive components of social understanding (empathy and Theory of Mind) as well as prosocial decisions and gaze behavior. In this task, participants take the role of the listener (rather than the observer; “second party”) and view videos in which people recount autobiographical episodes. Accordingly, the EmpaToM allows the systematic and comprehensive manipulation of narrators’ gaze behavior and of the emotional context (i.e., narration content). Conditions of the Zoom and EmpaToM tasks that prove particularly insightful in the first two work packages will be implemented in an fMRI study in work package 3. Neuroimaging data can clarify how gaze and context information influence each other during multimodal (audio-visual) social scene processing and how the neural underpinnings of this integration predict behavioral outcomes. Insights gained from all three work packages will further our understanding of the functional role of context-flexible gaze processing in social understanding and interaction.
DFG Programme
Research Grants