Interactive Locomotion User Interfaces for Real Walking through Virtual Worlds - From Perception to Application

Subject Area Ergonomics, Human Factors, Human-Machine Systems
Funding Funding from 2009 to 2023
Project Identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 137297816
 
Year of Creation 2021

Summary of Project Results

Real walking is the most natural way to locomote in immersive virtual environments (IVEs), but it is constrained by the physical space in which VR users can be tracked. Natural walking through large virtual worlds therefore requires perceptually inspired locomotion techniques such as redirected walking (RDW). RDW comprises a collection of techniques that, for example, imperceptibly rotate the user's view of the virtual scene to steer her along a confined physical path while giving the impression of walking in a straight line through a large virtual space.

In the iLUI project, we designed, developed, and evaluated novel RDW techniques focusing on (i) perceptual adaptation, (ii) gaze-based as well as (iii) feet-based RDW, and provided (iv) adaptive controllers, algorithms, and applications.

In the first work package we showed that perceptual thresholds for the detection of RDW are plastic and can be shifted by applying RDW in a consistent manner. This makes it possible to increase the effectiveness of RDW by manipulating the perceptual threshold itself.

In the second work package we implemented and tested ways of using eye tracking in head-mounted displays (HMDs) to imperceptibly reorient the user, focusing on periods such as blinks and saccades during which perceptual thresholds for scene changes are elevated.

In the third work package we manipulated the user's body information to morph the virtual body relative to the actual real-world body pose and determined the user's sensitivity to such discrepancies. We developed two strategies to cope with inconsistencies caused by camera manipulation and body feedback, in particular from vision of the user's feet.

In the fourth work package we developed algorithms and software controllers for RDW setups that incorporate the novel approaches of the iLUI project and rely on information about gaze as well as body poses. Our approach leverages bent paths in a way that can provide undetectable RDW manipulations even in room-scale VR. The software is available as an open-source GitHub project called Space-Extender.

Overall, the work performed in the project introduced novel approaches that significantly decrease the space requirements of RDW, so that virtual omnidirectional walking through arbitrary virtual environments becomes possible even when only a small physical walking space is available, such as in a CAVE or a smaller lab environment.
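As an illustration of the basic RDW principle behind these results, the following minimal Python sketch applies a curvature gain: per walked step, the virtual scene is rotated by a small angle so that a straight virtual path maps onto a physical circle. The function name, the step-wise interface, and the threshold radius are illustrative assumptions, not code or measured values from the project.

    import math

    # Placeholder detection threshold: the smallest circle radius assumed to
    # be undetectable. Reported thresholds in the literature vary widely
    # (larger for straight paths, smaller for bent paths); this value is
    # illustrative only, not a result measured in the iLUI project.
    MIN_UNDETECTABLE_RADIUS_M = 7.5

    def curvature_rotation(step_length_m: float,
                           radius_m: float = MIN_UNDETECTABLE_RADIUS_M) -> float:
        """Yaw offset (radians) to inject for one walked step.

        Rotating the virtual scene by step_length / radius per step bends
        the user's physical trajectory onto a circle of that radius while
        the perceived virtual path stays straight.
        """
        if radius_m < MIN_UNDETECTABLE_RADIUS_M:
            raise ValueError("curvature gain would exceed the assumed threshold")
        return step_length_m / radius_m

    # Example: rotation silently injected over a 10 m straight virtual walk.
    yaw = 0.0
    for _ in range(20):            # 20 steps of 0.5 m each
        yaw += curvature_rotation(0.5)
    print(f"injected rotation: {math.degrees(yaw):.1f} deg")  # ~76.4 deg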
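Similarly, the gaze-based redirection of the second work package can be sketched as clamping any scene reorientation so that it is applied only while a blink is detected. Again, ScenePose, redirect_on_blink, and the 5 degree limit are hypothetical placeholders for illustration; the thresholds actually measured in the project are reported in the publications below.

    from dataclasses import dataclass

    # Placeholder limit on how far the scene may be rotated during a single
    # blink; the real imperceptibility thresholds were measured empirically
    # (Langbehn et al., 2018) and are not reproduced here.
    MAX_BLINK_ROTATION_DEG = 5.0

    @dataclass
    class ScenePose:
        yaw_deg: float  # current rotation of the virtual scene around the user

    def redirect_on_blink(scene: ScenePose, wanted_yaw_deg: float,
                          eyes_closed: bool) -> ScenePose:
        """Apply a clamped scene rotation only while the eyes are closed.

        Blink-induced suppression elevates the detection threshold for
        scene changes, so a small discrete reorientation injected during
        the blink goes unnoticed, whereas the same jump with open eyes
        would be clearly visible.
        """
        if not eyes_closed:
            return scene
        step = max(-MAX_BLINK_ROTATION_DEG,
                   min(MAX_BLINK_ROTATION_DEG, wanted_yaw_deg))
        return ScenePose(scene.yaw_deg + step)

    # Example: a steering controller spreads a 12 degree correction over blinks.
    scene, remaining = ScenePose(0.0), 12.0
    for blink_detected in (True, True, True):
        applied = redirect_on_blink(scene, remaining, blink_detected)
        remaining -= applied.yaw_deg - scene.yaw_deg
        scene = applied
    print(scene.yaw_deg, remaining)  # 12.0 0.0 after three blinks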

Project-Related Publications (Selection)

  • (2016): Visual blur in immersive virtual environments: does depth of field or motion blur affect distance and speed estimation? Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST), pp. 241-250
    Langbehn, E., Raupp, T., Bruder, G., Steinicke, F., Bolte, B., Lappe, M.
    (Available online at https://doi.org/10.1145/2993369.2993379)
  • (2017): The accuracy and precision of position and orientation tracking in the HTC Vive virtual reality system for scientific research. i-Perception, 8(3), pp. 1-23
    Niehorster, D.C., Li, L., Lappe, M.
    (Available online at https://doi.org/10.1177/2041669517708205)
  • (2017): Bending the curve: sensitivity to bending of curved paths and application in room-scale VR. IEEE Transactions on Visualization and Computer Graphics, 23(4), pp. 1389-1398
    Langbehn, E., Lubos, P., Bruder, G., Steinicke, F.
    (Available online at https://doi.org/10.1109/tvcg.2017.2657220)
  • (2018): 15 years of research on redirected walking in immersive virtual environments. IEEE Computer Graphics and Applications, 38(2), pp. 44-56
    Nilsson, N.C., Peck, T., Bruder, G., Hodgson, E., Serafin, S., Whitton, M., Steinicke, F., Suma Rosenberg, E.
    (Available online at https://doi.org/10.1109/MCG.2018.111125628)
  • (2018): In the blink of an eye - leveraging blink-induced suppression for imperceptible position and orientation redirection in virtual reality. ACM Transactions on Graphics, 37(4), pp. 1-11
    Langbehn, E., Steinicke, F., Lappe, M., Welch, G. F., Bruder, G.
    (Available online at https://doi.org/10.1145/3197517.3201335)
  • (2019): Shrinking circles: adaptation to increased curvature gain in redirected walking. IEEE Transactions on Visualization and Computer Graphics, pp. 1-8
    Bölling, L., Stein, N., Steinicke, F., Lappe, M.
    (Available online at https://doi.org/10.1109/TVCG.2019.2899228)
  • (2020): Detection thresholds for vertical gains in VR and drone-based telepresence systems. Proceedings of the IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 101-107
    Matsumoto, K., Langbehn, E., Narumi, T., Steinicke, F.
    (Available online at https://doi.org/10.1109/VR46266.2020.00028)
  • (2020): Walking by cycling - a novel in-place locomotion user interface for seated virtual reality experiences. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI), pp. 1-12
    Freiwald, J.P., Ariza, O., Janeh, O., Steinicke, F.
    (Available online at https://doi.org/10.1145/3313831.3376574)
  • (2021): A comparison of eye tracking latencies among several commercial head-mounted displays. i-Perception, 12(1), pp. 1-16
    Stein, N., Niehorster, D. C., Watson, T., Steinicke, F., Rifai, K., Wahl, S., Lappe, M.
    (Available online at https://doi.org/10.1177/2041669520983338)
 
 
