
Crossmodal Temporal Integration in Apparent Motion

Subject Area: General, Cognitive and Mathematical Psychology
Funding: 2011 to 2014
Project Identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 194001222

Year of Creation: 2015

Summary of Project Results

The focus of this project was dynamic crossmodal temporal integration of multisensory information, in particular in motion perception. We used both implicit measures of temporal processing, by applying Ternus apparent motion, and explicit measures, by duration reproduction. In the first stage of the research, two important factors in multisensory temporal integration were identified: the crossmodal interval and perceptual grouping. Several studies provided convergent evidence that crossmodal temporal integration determines the temporal ventriloquist effect. Asymmetric crossmodal or intramodal perceptual grouping, on the other hand, may abolish the temporal ventriloquist effect.

Interval (duration) integration also plays a critical role in crossmodal apparent motion and sensorimotor timing. The reproduced duration, for example, is a mixture of motor and perceptual time, and the weights of the perceptual and motor components depend on the variability of the corresponding estimates. Moreover, when a sensory feedback delay is introduced, the reproduced duration relies heavily on the onset of the feedback as well as on the offset of the motor action. Using quantitative measures and Bayesian approaches, crossmodal temporal integration has been shown to follow the maximum-likelihood estimation (MLE) model with some modifications: incorporating biases explicitly in the model yields high predictive accuracy of MLE for crossmodal perceptual duration integration and sensorimotor duration reproduction.

The results of the project also raised further research questions. One challenging issue in multisensory temporal integration is biased temporal estimates. It is common knowledge that time perception can easily be distorted by a variety of factors. Given that time processing is distributed, differential biases in different sensory time estimates may cause an internal conflict of time representations, so the brain must continuously calibrate the related sensory estimates to maintain internal consistency. Such calibration of internal priors by prediction errors has been proposed within a generative Bayesian framework, which has successfully predicted various types of multisensory temporal integration. We summarized recent progress on multisensory temporal integration and calibration, and the related Bayesian-inference approaches, in our recent review paper.
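The MLE integration scheme referred to above weights each sensory estimate by its reliability (inverse variance), so the more precise modality dominates the combined percept. As a minimal illustrative sketch (not the project's actual analysis code; the numbers are invented):

```python
# Sketch of maximum-likelihood (MLE) cue integration: each sensory
# duration estimate is weighted by its reliability (inverse variance).
# Purely illustrative; estimates and variances below are made up.

def mle_integrate(estimates, variances):
    """Combine sensory estimates (e.g., visual and auditory durations)
    into a single MLE estimate and its predicted variance."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]  # normalized weights
    combined = sum(w * e for w, e in zip(weights, estimates))
    combined_var = 1.0 / total  # integrated variance <= each input variance
    return combined, combined_var

# Hypothetical example: visual duration 500 ms (variance 100),
# auditory duration 520 ms (variance 25). Audition is more reliable,
# so the combined estimate lies closer to 520 ms, with lower variance
# than either single estimate.
duration, var = mle_integrate([500.0, 520.0], [100.0, 25.0])
```

In this sketch the integrated variance is always smaller than the smallest single-cue variance, which is the classic MLE prediction the project tested (and extended by adding explicit bias terms).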

Project-Related Publications (Selection)

 
 
