
Interactive Locomotion User Interfaces for Real Walking through Virtual Worlds - From Perception to Application

Subject Area Ergonomics, Human Factors, Human-Machine Systems
Funding Funded from 2009 to 2023
Project Identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 137297816

Year of Creation 2021

Summary of Project Results

Real walking is the most natural way to locomote through immersive virtual environments (IVEs), but the confined physical space in which VR users can be tracked limits its applicability. Natural walking through large virtual worlds therefore relies on perceptually inspired locomotion techniques such as redirected walking (RDW), a collection of techniques that address this problem. One of these techniques imperceptibly rotates the user's view of the virtual scene to steer her along a confined physical path while giving the impression of walking in a straight line through a large virtual space.

In the iLUI project, we designed, developed, and evaluated novel RDW techniques focusing on (i) perceptual adaptation, (ii) gaze-based as well as (iii) feet-based RDW, and provided (iv) adaptive controllers, algorithms, and applications. In the first work package we showed that perceptual thresholds for the detection of RDW are plastic and can be shifted by applying RDW in a consistent manner; this makes it possible to increase the effectiveness of RDW by manipulating the perceptual threshold itself. In the second work package we implemented and tested ways of using eye tracking in head-mounted displays (HMDs) to imperceptibly reorient the user during RDW, focusing on periods such as blinks and saccades in which perceptual thresholds for scene changes are elevated. In the third work package we manipulated the user's body information to morph the virtual body relative to the actual real-world body pose and determined the user's sensitivity to such discrepancies. We developed two different strategies to cope with the inconsistencies caused by camera manipulation and body feedback, in particular from the user's view of their own feet. In the fourth work package we developed algorithms and software controllers for RDW setups that incorporate the novel approaches of the iLUI project, relying on information about gaze as well as body poses.
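The core rotation-gain idea described above can be sketched in a few lines. This is a minimal illustration, not the project's controller: the function name, the gain values, and the steering policy are assumptions chosen for the example; actual detection thresholds are determined empirically per user and condition.

```python
# Illustrative detection thresholds for rotation gains (assumed
# values, not figures from the project): virtual head rotation may
# differ from real rotation by a factor in [MIN_GAIN, MAX_GAIN]
# without the user noticing.
MIN_GAIN = 0.8
MAX_GAIN = 1.2

def redirected_rotation(real_delta_deg: float, toward_center: bool) -> float:
    """Scale the user's real head rotation by an imperceptible gain.

    Rotations that steer the user toward the centre of the tracking
    space are amplified; rotations away from it are dampened, so the
    user physically stays inside the room while feeling unconstrained.
    """
    gain = MAX_GAIN if toward_center else MIN_GAIN
    return real_delta_deg * gain

# Example: the user turns 30 degrees in a direction that keeps her
# inside the room; the virtual camera turns 36 degrees, so less
# physical turning is required than is perceived.
virtual_delta = redirected_rotation(30.0, toward_center=True)
```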
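The blink- and saccade-based reorientation of the second work package exploits the brief moments in which scene changes go unnoticed. The sketch below illustrates the principle only; the per-blink rotation budget and the function names are assumptions for the example, not values from the project.

```python
# Imperceptible rotation that may be injected during a single blink
# (an assumed illustrative budget, not a measured threshold).
BLINK_ROTATION_BUDGET_DEG = 5.0

def reorientation_step(eyes_closed: bool, needed_deg: float) -> float:
    """Return the extra scene rotation to inject this frame.

    Large corrections are applied only while the eyes are closed,
    clamped to the per-blink budget; with open eyes no extra
    rotation is injected.
    """
    if eyes_closed:
        return max(-BLINK_ROTATION_BUDGET_DEG,
                   min(BLINK_ROTATION_BUDGET_DEG, needed_deg))
    return 0.0
```

Spreading a large correction over several blinks in this way keeps every individual change below the elevated detection threshold.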
Our approach leverages bent paths in a way that provides undetectable RDW manipulations even in room-scale VR. The software is available as an open-source GitHub project called Space-Extender. The work performed in the project introduced novel approaches that significantly decrease the space requirements of RDW, so that virtual omnidirectional walking through arbitrary VEs becomes possible even when only a small physical walking space is available, such as in a CAVE or a smaller lab environment.
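The geometry behind bending a straight virtual path onto a physical arc reduces to a simple relation: walking along a circle of radius r sweeps 1/r radians per metre, so that is the rotation a controller must inject per metre walked. The numbers below are illustrative assumptions, not results from the project.

```python
import math

def required_rotation_per_meter(radius_m: float) -> float:
    """Rotation (in degrees) to inject per metre walked so that a
    virtually straight path bends onto a physical circle of the
    given radius: arc length 1 m on radius r spans 1/r radians."""
    return math.degrees(1.0 / radius_m)

# Example (assumed radius): keeping the user on a 5 m physical
# circle requires rotating the scene by roughly 11.5 degrees for
# every metre the user walks.
rot = required_rotation_per_meter(5.0)
```

The smaller the available room, the tighter the physical circle and the larger the injected rotation per metre, which is why undetectable room-scale redirection is hard and why decreased space requirements matter.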

Project-Related Publications (Selection)
