Project Details

Acoustical Awareness, Orientation, and Navigation in Rooms

Applicant Dr. Stephan Ewert, since 8/2021
Subject Area General, Cognitive and Mathematical Psychology
Acoustics
Term from 2018 to 2023
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 406277157
 
Our awareness of space, and our subsequent orientation and navigation in it, are dominated by the visual system, which relies on the high-resolution topographic representation of space on the retinae of our eyes. Nevertheless, the auditory system can contribute: while the visual field is limited to the viewing direction, spatial hearing is omnidirectional and often guides head and eye orientation. While this auditory contribution may be rather small in sighted persons, visually impaired humans are likely to rely much more on auditory cues. An extreme case is some blind humans’ ability to navigate using echolocation. However, even without emitting dedicated sounds (e.g. mouth clicks for echolocation), humans can extract information about their surrounding space from the way external sounds are affected by surfaces and objects, which generate early reflections and later reverberation in enclosed spaces.

This project is designed to formally quantify the human ability to orient in and navigate through space by means of (i) combined audio-visual cues, (ii) normal and magnified auditory cues without visual cues, and (iii) echo-acoustic cues alone, in humans trained in echolocation. We will employ virtual audio-visual environments with loudspeaker arrays or headphones and head-mounted displays to provide fully controlled (echo-)acoustic and visual cues. Building on these baseline quantifications, we will develop acoustic augmented-reality solutions and evaluate their effectiveness in virtual environments, aiming to optimize the acoustic cues that mediate spatial information.

The project benefits strongly from the complementary expertise of the applicants, particularly in room acoustic simulation and signal processing on the one hand and echolocation on the other, as well as from their common interest in understanding binaural hearing and its role in challenging real-world situations. Their shared expertise in psychoacoustics and modelling, together with the applied virtual-reality techniques, will allow the auditory-perceptual cues utilized by the subjects to be delineated. While sharing common techniques and procedures, the complementary work programmes of the two labs will focus on virtual acoustics and acoustic augmented reality, and on reverberation and human echolocation, respectively.

The project will provide new insights into the extent to which listening to rooms may facilitate visually impaired subjects’ mobility, and will show which perceptual features the subjects exploit. Moreover, the project will provide rehabilitative and assistive acoustic technologies for people with visual impairments.
DFG Programme Research Grants
Former Applicant Professor Dr. Lutz Wiegrebe, until 8/2021 (†)