Project Details

Multistable perception across modalities: Resolving sensory ambiguity in vision and audition

Subject Area General, Cognitive and Mathematical Psychology
Term from 2017 to 2021
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 325414673
 
Final Report Year 2024

Final Report Abstract

Multistability refers to a situation in which the subjective perceptual interpretation of a sensory stimulus alternates over time. Such phenomena are well known in vision and audition, but little research had so far been dedicated to the commonalities and interactions between multistability in the two sensory modalities. The research conducted in this project has begun to close this gap. For specific classes of multistable stimuli, we found strong correspondences between audition and vision. For example, individuals who tend to see two overlaid drifting gratings as an integrated plaid pattern more frequently than others also tend to perceive two simultaneously presented tone sequences more frequently as an integrated “melody” pattern. Moreover, this also applies moment by moment: at times when the integrated auditory pattern is perceived, the visual pattern is also more likely to be perceived as integrated than when the individual sound streams perceptually segregate. This extends to a semantic level: we found that perceiving a specific word among multiple possible alternatives strengthens the perception of the visual stimulus corresponding to this word. On a physiological level, we showed that auditory perceptual transitions evoke pupil dilation to which the execution of the behavioural response strongly contributes, a pattern well established in vision. Obtaining results like these has only been possible by developing and applying novel techniques that either allow assessing subjective perception in at least one modality without the participant having to provide overt report, or allow unobtrusively measuring other perceptual phenomena (e.g., the perceptual processing of a probe) while participants report their perception. We therefore developed and refined such “no-report” methods, based on eyetracking and electroencephalography (EEG) measurements.
Moreover, we developed and validated new stimulus sets, in particular with respect to verbal transformation; that is, we identified German words that induce the perception of additional meanings when presented repeatedly. In sum, during the project we made substantial progress on the conceptual and the methodological level, achieved a better understanding of multistability in vision and audition, and demonstrated several cross-modal effects, dependencies and correspondences that had been unknown at the project’s onset. Although this project is fundamental research, the importance of cross-modal correspondences extends far beyond the realm of multistability. A better understanding of the coupling between vision and audition will eventually allow a better understanding of scene analysis and thereby contribute to supporting perception in conditions that are challenging in general or for specific individuals.
