
Multisensory Perception and View Management in Augmented Reality

Subject area: Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Funding: from 2017 to 2021
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 320128462
 
Year of creation: 2021

Summary of project results

Augmented reality (AR) refers to the combination of real and virtual information by adding ("augmenting") extra layers of digital information onto real-world imagery. This information is commonly visual in nature. Besides handheld devices, head-worn displays are a common medium for presenting augmented information. However, the display space ("field of view", FOV) of head-worn devices is generally limited, and current devices augment only a small part of human vision. As a result, clutter, occlusion and difficulties in perceiving and processing information are typical challenges when displaying information in narrow AR FOV displays. The project aimed to address these issues by encoding visual information in other sensory channels.

To gain a better understanding of how restricting the FOV affects perception and cognitive processing in AR, we performed a number of studies on visual search with a restricted FOV. The results indicate how different degrees of FOV restriction and information density can slow down visual search and affect visual search behavior for text and pictorial labels.

We further developed and tested different approaches using audio and vibration cues on the head. These cues guided users to digital information surrounding them without relying on additional visual cues. We found that with our reference method users were able to judge the information's position well, with only minor deviations between the judged position and the actual location of the information. Users were also significantly faster and more accurate than with the other methods we developed.

As a next step, we tested the method resulting from our previous publication against a popular visual guidance method, EyeSee360, to examine the differences in their effectiveness for AR search and their influence on situation awareness. We showed that our method using audio and vibration cues, although generally slower, still performs comparably well to EyeSee360. The main advantage of our method, however, is that it provides a significant improvement in situation awareness over the visual approach.

Looking further ahead, users often divide their attention between different tasks when interacting in AR. In such situations, relevant information can easily be missed, especially in information-rich and dynamic environments. To address this problem, we designed and evaluated combined visual, audio and tactile feedback for our multisensory head-mounted system. The feedback provides information on the location of an object of interest that lies outside the AR FOV and alerts the user when the object enters the FOV (simplified sketches of this out-of-view guidance and FOV-entry check follow at the end of this summary). The comparison of different cue modalities showed that short vibrations at the head are particularly useful for promoting fast reactions and are still well perceived under visual and auditory noise.

Overall, reflecting on the project, users reacted very positively to non-visual feedback and were able to learn and use the cue metaphors effectively within a short time. In terms of accuracy, the non-visual methods were able to compete with highly trained visual techniques. An open question is whether non-visual feedback can also keep up with visual methods in terms of speed in AR scenarios when studied over a longer period with regular training.
Somewhat surprisingly, tactile feedback received a particularly positive response throughout our studies, despite the dominance of audio-visual media in today's society. Before the first exploratory tests, we were also unsure how well users would accept tactile stimuli applied to the head. Fortunately, these concerns proved to be unfounded, which further motivates the integration of tactile stimuli into head-mounted AR systems.
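
As a rough illustration of the geometry behind such audio-tactile guidance, the following minimal Python sketch maps a target's direction relative to the current head orientation to hypothetical cue parameters, such as which vibration actuator in an assumed ring around the head to pulse and how to pan an audio cue. It is not the project's implementation; the actuator layout, function names and parameter values are illustrative assumptions.

    import math

    # Hypothetical sketch: translate a target's direction, relative to the
    # current head orientation, into audio-tactile guidance cue parameters.
    # The actuator count and the mappings are illustrative assumptions.

    NUM_ACTUATORS = 8  # assumed ring of vibration motors around the head

    def relative_angles(head_yaw, head_pitch, target_azimuth, target_elevation):
        """Target azimuth/elevation relative to the head, in degrees; azimuth wrapped to [-180, 180]."""
        d_az = ((target_azimuth - head_yaw + 180.0) % 360.0) - 180.0
        d_el = target_elevation - head_pitch
        return d_az, d_el

    def guidance_cue(d_az, d_el):
        """Map the relative direction to an actuator index, a stereo audio pan,
        and an intensity that grows with the angular distance to the target."""
        actuator = round((d_az % 360.0) / (360.0 / NUM_ACTUATORS)) % NUM_ACTUATORS
        pan = max(-1.0, min(1.0, d_az / 90.0))  # -1 = fully left, +1 = fully right
        intensity = min(1.0, math.hypot(d_az, d_el) / 180.0)
        return {"actuator": actuator, "audio_pan": pan, "intensity": intensity}

    # Example: a label 60 degrees to the right of and slightly above the view direction.
    d_az, d_el = relative_angles(head_yaw=10.0, head_pitch=0.0,
                                 target_azimuth=70.0, target_elevation=5.0)
    print(guidance_cue(d_az, d_el))  # -> actuator 1, audio panned right, moderate intensity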
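
Building on the same assumed geometry, a per-frame check for when the object of interest enters the AR field of view, at which point the alert cue mentioned above would be triggered, could look as follows. The FOV extents are placeholder values, not those of any device used in the project.

    # Hypothetical continuation: alert the user once a tracked object enters
    # the (assumed) rectangular AR field of view; values are placeholders.

    HALF_FOV_H = 17.5  # assumed half of a ~35 degree horizontal AR FOV
    HALF_FOV_V = 10.0  # assumed half of a ~20 degree vertical AR FOV

    def in_fov(d_az, d_el, half_h=HALF_FOV_H, half_v=HALF_FOV_V):
        """True if the relative direction lies inside the rectangular FOV."""
        return abs(d_az) <= half_h and abs(d_el) <= half_v

    def update(d_az, d_el, was_in_fov):
        """Per-frame update: keep guiding while the object is out of view and
        trigger a short alert vibration the moment it enters the FOV."""
        now_in_fov = in_fov(d_az, d_el)
        if now_in_fov and not was_in_fov:
            print("object entered the FOV -> short alert vibration at the head")
        elif not now_in_fov:
            print("object still out of view -> keep audio-tactile guidance active")
        return now_in_fov

    # Example: the object moves from 20 degrees to 5 degrees off-center.
    state = update(20.0, 0.0, was_in_fov=False)  # still out of view
    state = update(5.0, 0.0, was_in_fov=state)   # entered the FOV -> alert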

Project-related publications (selection)

  • Audio-Tactile Proximity Feedback for Enhancing 3D Manipulation. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST), 2019
    Marquardt, A., Kruijff, E., Trepkowski, C., Maiero, J., Schwandt, A., Hinkenjann, A., Stuerzlinger, W., Schoening, J.
    (Available online at https://doi.org/10.1145/3281505.3281525)
  • The Influence of Label Design on Search Performance and Noticeability in Wide Field of View Augmented Reality Displays. In IEEE Transactions on Visualization and Computer Graphics (TVCG), 2018
    Kruijff, E., Orlosky, J., Kishishita, N., Trepkowski, C., Kiyokawa, K.
    (Available online at https://doi.org/10.1109/TVCG.2018.2854737)
  • Non-Visual Cues for View Management in Narrow Field of View Augmented Reality Displays. In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 2019
    Marquardt, A., Trepkowski, C., Eibich, T., Maiero, J., Kruijff, E.
    (Available online at https://doi.org/10.1109/ISMAR.2019.000-3)
  • The Effect of Narrow Field of View and Information Density on Visual Search Performance in Augmented Reality. In Proceedings of the IEEE Conference on Virtual Reality (VR), 2019
    Trepkowski, C., Eibich, D., Maiero, J., Marquardt, A., Kruijff, E., and Feiner, S.
    (Available online at https://doi.org/10.1109/VR.2019.8798312)
  • Comparing Non-Visual and Visual Guidance Methods for Narrow Field of View Augmented Reality Displays. IEEE Transactions on Visualization and Computer Graphics (TVCG), Volume 26, Issue 12, pp. 3389-3401, 2020
    Marquardt, A., Trepkowski, C., Eibich, T., Maiero, J., Kruijff, E., Schöning, J.
    (Available online at https://doi.org/10.1109/TVCG.2020.3023605)
 
 
