
Multisensory Perception and View Management in Augmented Reality

Subject Area: Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Funding: from 2017 to 2021
Project Identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 320128462

Year of Creation: 2021

Summary of Project Results

Augmented reality (AR) refers to the combination of real and virtual information by adding ("augmenting") extra layers of digital information onto real-world imagery. This information is commonly visual in nature. Alongside handheld devices, head-worn displays are a common medium for presenting augmented information. However, the display space (the "field of view", FOV) of head-worn devices is generally limited, and current devices augment only a small part of human vision. As a result, clutter, occlusion, and difficulties in perceiving and processing information are typical challenges when displaying information on narrow-FOV AR displays. The project aimed to address these issues by encoding visual information in other sensory channels.

To gain a better understanding of how restricting the FOV affects perception and cognitive processing in AR, we performed a number of studies on visual search with a restricted FOV. The results indicate how different degrees of FOV restriction and different information density levels can slow down visual search and change search behavior for text and pictorial labels.

We further developed and tested different approaches that use audio and vibration cues on the head. These cues guided users to digital information surrounding them without relying on additional visual cues. With our reference method, users were able to judge an item's position well, with only minor deviations between the judged and actual location, and they were significantly faster and more accurate than with the other methods we developed. As a next step, we tested the method resulting from our previous publication against EyeSee360, a popular visual guidance method, to compare their effectiveness in AR search and their influence on situation awareness. We showed that our method using audio and vibration cues, although generally slower, still performs comparably to EyeSee360. Its main advantage is that it provides a significant improvement in situation awareness over the visual approach.

Looking further ahead, users often divide their attention between different tasks when interacting in AR, and in such situations relevant information is easily missed, especially in information-rich, dynamic environments. To address this problem, we designed and evaluated combined visual, audio, and tactile feedback for our multisensory head-mounted system. The feedback conveys the location of an object of interest that is outside the AR FOV and alerts the user when that object enters the FOV. The comparison of different cue modalities showed that short vibrations at the head are particularly useful for promoting fast reactions and remain well perceived under visual and auditory noise.

Overall, reflecting on the project, users reacted very positively to non-visual feedback and were able to learn and apply the metaphors effectively within a short time. In terms of accuracy, the non-visual methods competed with highly trained visual techniques. An open question is whether, with regular training over a longer period, non-visual feedback can also keep up with visual methods in terms of speed in AR scenarios.
Somewhat surprisingly, tactile feedback received a particularly positive response throughout our studies, despite the dominance of audio-visual media in today's society. Before the first exploratory tests, we were also unsure how readily users would accept tactile stimuli applied to the head. Fortunately, these concerns proved to be unfounded, which further motivates the integration of tactile stimuli into head-mounted AR systems.
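
The following minimal Python sketch is our own illustration of the general idea behind such head-referenced guidance, not the project's actual implementation; the 40-degree FOV and the four actuator positions are assumed values. It computes the signed angle between the head orientation and a target, cues the direction with the nearest vibration actuator while the target is outside the FOV, and reports when the target enters the FOV so an entry alert could be triggered instead.

# Minimal sketch with hypothetical parameters: map the direction of an
# out-of-view target to one of several vibration motors placed around the
# head, and report when the target enters the narrow AR FOV.

FOV_HORIZONTAL_DEG = 40.0                  # assumed horizontal FOV of the AR display
ACTUATOR_ANGLES_DEG = [0, 90, 180, 270]    # assumed motor positions: front, right, back, left

def signed_angle_deg(from_deg, to_deg):
    """Smallest signed angle from one heading to another, in [-180, 180)."""
    return (to_deg - from_deg + 180.0) % 360.0 - 180.0

def select_actuator(target_angle_deg):
    """Index of the actuator closest to the head-relative target direction."""
    return min(range(len(ACTUATOR_ANGLES_DEG)),
               key=lambda i: abs(signed_angle_deg(ACTUATOR_ANGLES_DEG[i], target_angle_deg)))

def guidance_cue(head_yaw_deg, target_bearing_deg):
    """Decide between a FOV-entry alert and a directional vibration cue."""
    angle = signed_angle_deg(head_yaw_deg, target_bearing_deg)
    if abs(angle) <= FOV_HORIZONTAL_DEG / 2.0:
        return {"in_fov": True, "actuator": None}       # target visible: trigger entry alert
    return {"in_fov": False, "actuator": select_actuator(angle)}

# Example: a target 120 degrees to the right of the current head orientation
print(guidance_cue(head_yaw_deg=10.0, target_bearing_deg=130.0))
# -> {'in_fov': False, 'actuator': 1}  (the motor on the right side of the head)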

