Project Details

Decoding multimodal information integration: neural dynamics, computational models, and the role of sleep

Subject Area Human Cognitive and Systems Neuroscience
Term since 2026
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 572233802
 
Humans continuously process multisensory input to guide decisions in complex environments. This raises the question of how different sensory modalities are integrated into coherent, generalizable knowledge structures. While previous research has focused on multisensory integration in perceptual contexts, it remains unclear how this integration unfolds during more abstract cognitive processes such as category learning and rule inference. The proposed project addresses these questions by investigating how multimodal sensory information is integrated into hierarchical category representations.

To this end, participants will perform a category learning task using naturalistic video stimuli depicting everyday coffee shop scenes, each defined by three feature dimensions: auditory (music), static visual (interior design), and dynamic visual (activity). Each feature dimension can take one of five values, ranked on a scale from 1 to 5. In a two-alternative forced-choice task, participants repeatedly choose the higher-valued coffee shop scene and receive feedback to guide learning. The evolving learning trajectories will be modelled within a reinforcement learning framework, enabling an assessment of how participants weigh and integrate the different sensory dimensions. This computational approach will allow exemplar-based memorization strategies to be dissociated from rule-based feature integration.

To capture the neural mechanisms supporting multimodal integration, functional MRI will track changes in neural activation patterns as participants acquire the categorization rule, while diffusion-weighted MRI will assess learning-related microstructural plasticity. Previous research has shown that sleep benefits rule inference in decision tasks, including information-integration category learning. This project will therefore test whether and how sleep supports the abstraction of rules across modalities by contrasting performance in participants who stay awake versus those who sleep after training. The influence of aperiodic EEG activity and oscillatory markers on post-sleep behavioral performance and neural representations will also be tested.

By tracing the development of multimodal category representations across learning and consolidation, the project seeks to characterize the neural dynamics and computational principles underlying multimodal rule inference. It further aims to test whether sleep actively reorganizes knowledge representations to enhance integration across sensory modalities. These findings are expected to advance understanding of the mechanisms supporting adaptive learning and decision-making in naturalistic environments and to inform neuro-inspired models of artificial intelligence that emulate human multimodal cognition.
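To make the modelling idea concrete, the kind of feature-weighted learner described above can be sketched in a toy simulation. This is a minimal illustration under strong simplifying assumptions, not the project's actual model: the ground-truth weights (TRUE_W), learning rate, inverse temperature, and trial structure are all invented for the example, and the agent learns a linear weighting of the three feature dimensions via a delta-rule (online logistic regression) update from trial feedback.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth weights for the three feature dimensions
# (auditory, static visual, dynamic visual) -- illustrative only.
TRUE_W = np.array([0.2, 0.3, 0.5])
N_DIMS, N_LEVELS, N_TRIALS = 3, 5, 3000
ALPHA, BETA = 0.1, 4.0          # learning rate, choice inverse temperature


def scene():
    """One scene: a level (1..5) per feature dimension, rescaled to [0, 1]."""
    return rng.integers(1, N_LEVELS + 1, size=N_DIMS) / N_LEVELS


w = np.zeros(N_DIMS)            # learned feature weights
accuracy = []
for t in range(N_TRIALS):
    a, b = scene(), scene()
    d = a - b
    # Softmax (logistic) choice between the two scenes
    p_a = 1.0 / (1.0 + np.exp(-BETA * (w @ d)))
    choose_a = rng.random() < p_a
    higher_is_a = TRUE_W @ d >= 0          # which scene is truly higher-valued
    accuracy.append(bool(choose_a) == bool(higher_is_a))
    # Feedback reveals which scene was higher; delta-rule update on the
    # feature-difference vector nudges weights toward the true ranking.
    w += ALPHA * (float(higher_is_a) - p_a) * d

early = np.mean(accuracy[:300])
late = np.mean(accuracy[-300:])
print(f"choice accuracy: early {early:.2f} -> late {late:.2f}")
print("learned weights (normalized):", np.round(w / w.sum(), 2))
```

In this toy setup, choice accuracy starts near chance and rises as the learned weights align with the ground-truth dimension weighting; an exemplar-based strategy would instead be modelled by storing and comparing individual scenes, which is the contrast the project's model comparison targets.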
DFG Programme Position
