
Investigations into the contribution of spatial hearing to auditory scene analysis in the auditory cortex of the behaving rhesus monkey

Applicant: Dr. Peter Bremen
Subject area: Cognitive, Systems and Behavioral Neurobiology
Funding: 2013 to 2016
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 246329269

Year created: 2016

Summary of project results

Integrating information from all our senses is of obvious benefit for detecting and identifying objects embedded in complex, everyday environments. For example, watching your conversation partner's lip movements while listening to her at a crowded, noisy cocktail party improves speech understanding. Over the last decades, behavioral and neurophysiological experiments have uncovered three principles that govern this multisensory integration: 1) to be integrated, stimuli from different senses must be in close spatial vicinity to each other (principle of spatial coincidence); 2) the relative timing of multisensory stimuli modulates the strength of multisensory interactions (principle of temporal coincidence); 3) the strength of multisensory interaction is inversely related to the efficacy of the unimodal stimuli (principle of inverse effectiveness).

Although neurons responsive to input from multiple senses have been described in brain regions ranging from the (reptilian) midbrain to the (mammalian) neocortex, we do not know how and where in the brain the 'binding of the senses' takes place. That is, we lack a unified theory as well as the neuronal correlates of uni- and multisensory perception. For example, we do not understand the role of 1) feedback connections from higher-order associative cortical areas in modulating multisensory integration in lower-order sensory cortices (top-down), and 2) cross-modal connections between sensory cortices (bottom-up).

In this project, as a first step toward addressing these issues, we implemented two behavioral paradigms in non-human primates and humans that selectively engage the neuronal networks involved in top-down and bottom-up processing. In the so-called focused-attention paradigm, the subject is cued to react to a change in, e.g., a sound and to suppress responses to a change in luminance of a small light-emitting diode (LED). This paradigm is designed to probe the influence of top-down processes on sensory processing. In contrast, the redundant-target paradigm predominantly probes bottom-up processing: the subject is instructed to react to the first perceived change irrespective of modality, i.e., sound or LED.

We found that 1) non-human primates and humans are capable of performing these complex audiovisual paradigms; 2) both species experience audiovisual integration, expressed as decreased reaction times and reduced reaction-time variability; 3) the principles described above govern audiovisual integration in both species; and 4) artificial audiovisual stimuli, such as a combination of an amplitude-modulated sound and a dimming light, can elicit audiovisual integration. The last point is noteworthy because the artificial stimuli used here allow audiovisual integration to be studied in controlled, well-parameterized paradigms that nevertheless capture features of more realistic everyday situations, such as understanding speech in crowded environments or with a hearing impairment. Applying these findings in single-neuron neurophysiological studies will allow the dissection of the neuronal networks involved in task-dependent audiovisual integration.

While our findings have no immediate economic or societal applications, we foresee at least three areas that will benefit in the future: 1) Sensory rehabilitation: a better understanding of the principle of inverse effectiveness will have an impact on the efficiency of cochlear implants. 2) Building optimal multimedia displays: the multimedia technology, speech and face recognition, artificial intelligence, and robotics sectors will profit from our research. 3) Understanding, and eventually treating, the diseased brain: unraveling how the brain copes with uncertainty and selection in complex environments is relevant for recognizing (via biomarkers), understanding, and treating cognitive disorders such as autism spectrum disorder (ASD) or ADHD.
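The reaction-time signature of the redundant-target paradigm can be illustrated with a minimal simulation. The standard baseline against which audiovisual integration is tested is the race model: if each modality independently triggers a response and the faster one wins, then the minimum of two reaction times already has a smaller mean and variance than either modality alone, even without true integration. The sketch below is purely illustrative; all parameter values (means, standard deviation, trial count) are assumptions, not data from the project.

```python
import random
import statistics

random.seed(1)

def simulate_rts(n=10000, mu_aud=0.30, mu_vis=0.32, sigma=0.05):
    """Race-model sketch for the redundant-target paradigm.

    On unimodal trials the reaction time (in seconds) is drawn from a
    single Gaussian; on redundant audiovisual trials the response is
    triggered by whichever modality finishes first, i.e. the minimum
    of the two draws. Parameter values are illustrative only.
    """
    aud = [random.gauss(mu_aud, sigma) for _ in range(n)]
    vis = [random.gauss(mu_vis, sigma) for _ in range(n)]
    av = [min(a, v) for a, v in zip(aud, vis)]
    return aud, vis, av

aud, vis, av = simulate_rts()
print(f"mean RT  A: {statistics.mean(aud):.3f}  "
      f"V: {statistics.mean(vis):.3f}  AV: {statistics.mean(av):.3f}")
print(f"sd RT    A: {statistics.stdev(aud):.3f}  "
      f"V: {statistics.stdev(vis):.3f}  AV: {statistics.stdev(av):.3f}")
```

Because mere statistical facilitation already predicts faster and less variable redundant-target responses, evidence for genuine audiovisual integration requires reaction-time distributions that beat this race-model prediction; the simulation only shows why the baseline itself must be accounted for.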

Project-related publications (selection)

  • A correlate of informational masking in anesthetized primary auditory cortex can be explained by basic neuronal tuning properties. Auditory Cortex Conference in Magdeburg, September 2014
    Bremen P, Middlebrooks JC
  • Double sound localization in elevation. Meeting of the Association for Research in Otolaryngology, February 2015
    Gross-Hardt R, Bremen P, Van Wanrooij MM, Van Opstal AJ
  • Primate saccade target selection relies on feedback competitive signal integration. Society for Neuroscience meeting, November 2015
    Goossens J, Kalisvaart J, Noest A, Van den Berg A, Massoudi A, Bremen P
  • Source segregation by spectra and space. Young Investigators Symposium at the meeting of the Association for Research in Otolaryngology, February 2015
    Bremen P, Middlebrooks JC
 
 
