Multistable perception across modalities: Resolving sensory ambiguity in vision and audition
Final Report Abstract
Multistability refers to situations in which the subjective perceptual interpretation of a sensory stimulus alternates over time. Such phenomena are well known in vision and audition, but little research has so far been dedicated to the commonalities and interactions between multistability in the two sensory modalities. The research conducted in this project has begun to close this gap. For specific classes of multistable stimuli, we found strong correspondences between audition and vision. For example, individuals who see two overlaid drifting gratings as an integrated plaid pattern more frequently than others also tend to perceive two simultaneously presented tone sequences more frequently as an integrated “melody”. Moreover, this correspondence also holds moment by moment: at times when the integrated auditory pattern is perceived, the visual pattern is also more likely to be perceived as integrated than when the individual sound streams perceptually segregate. The correspondence extends to a semantic level: we found that perceiving a specific word among multiple possible alternatives strengthens the perception of the visual stimulus corresponding to this word. On a physiological level, we showed that auditory perceptual transitions evoke pupil dilation, to which the execution of the behavioural response strongly contributes, a pattern well established in vision. Obtaining results like these has only been possible by developing and applying novel techniques that either allow assessing subjective perception in at least one modality without requiring participants to provide an overt report, or allow unobtrusively measuring other perceptual phenomena (e.g., the perceptual processing of a probe) while participants report their perception. We therefore developed and refined such “no-report” methods, based on eye-tracking and electroencephalography (EEG) measurements.
Moreover, we developed and validated new stimulus sets, in particular with respect to verbal transformations; that is, we identified German words that induce the perception of additional meanings when presented repeatedly. In sum, during the project we made substantial progress on both the conceptual and the methodological level, achieved a better understanding of multistability in vision and audition, and demonstrated several cross-modal effects, dependencies, and correspondences that had been unknown at the project’s onset. Although this project is fundamental research, the importance of cross-modal correspondences extends far beyond the realm of multistability. A better understanding of the coupling between vision and audition will eventually allow a better understanding of scene analysis and thereby contribute to supporting perception in conditions that are challenging in general or for specific individuals.
Publications
- Einhäuser, Wolfgang; da Silva Lucas, F. O. & Bendixen, Alexandra. Intraindividual Consistency Between Auditory and Visual Multistability. Perception, 49(2), 119-138.
- Wegner, Thomas G. G.; Grenzebach, Jan; Bendixen, Alexandra & Einhäuser, Wolfgang. Parameter dependence in visual pattern-component rivalry at onset and during prolonged viewing. Vision Research, 182, 69-88.
- Grenzebach, Jan; Wegner, Thomas G. G.; Einhäuser, Wolfgang & Bendixen, Alexandra. Pupillometry in auditory multistability. PLOS ONE, 16(6), e0252370.
- Thomassen, Sabine; Hartung, Kevin; Einhäuser, Wolfgang & Bendixen, Alexandra. Low-high-low or high-low-high? Pattern effects on sequential auditory scene analysis. The Journal of the Acoustical Society of America, 152(5), 2758-2768.
