Project Details

SATOP - Situation Awareness during Teleoperation

Subject Area Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics
Human Factors, Ergonomics, Human-Machine Systems
Term from 2015 to 2022
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 272130626
 
Despite the many advantages of teleoperation systems, working with most current systems is often challenging. In fact, some have gone so far as to say that teleoperation “traditionally suffers from suboptimal performance and limited situation awareness” (e.g. Kuiper, Frumau, van der Helm & Abbink, 2013, p. 1822), emphasizing the need to research possible causes of performance problems and to establish evidence-based guidelines for the design of human-machine interfaces that improve performance. The main objective of the SATOP project is to examine empirically the effects of multi-modal feedback cues on situation awareness and task performance in teleoperation. Specifically, the effects of haptic, auditory and visual sensory cues and their bi- and tri-modal combinations are of interest.
In phase 1 of the project (Jan 2016 - Dec 2017), an experimental setup featuring a simulation of a teleoperated mobile robot in rugged terrain was implemented, and experiments were conducted to investigate the effects of multi-modal direction cues on direction-localization and navigation performance as well as on various measures of situation awareness. The results obtained thus far indicate ambiguous effects: for example, adding auditory and/or haptic direction cues to a visual direction cue does not significantly improve localization performance, whereas adding an auditory direction cue to a haptic cue significantly deteriorates localization performance. These results point towards possible underlying mechanisms of multi-sensory perception that might explain the differences in the effects of multi-modal direction cues on task performance, and that also predict differential effects on situation awareness. As yet, research on the formulation and evaluation of models that predict task performance and situation awareness in teleoperation on the basis of multi-sensory cue-processing mechanisms is virtually non-existent.
As a result, it is currently exceedingly difficult, if not impossible, to predict accurately under which circumstances and for which tasks different feedback modalities and their combinations are likely to improve task performance and/or situation awareness. The aim of the proposed phase 2 of project SATOP is therefore to investigate the concept of Bayesian Sensory Integration in teleoperation. To explain and predict the differential effects of multi-modal feedback, three experiments and a statistical meta-analysis are planned to investigate systematically whether the proposed multi-sensory integration mechanism can account for the observed differences in the effects of multi-modal direction cues on teleoperation performance and situation awareness. On the basis of the empirical results, a model of multi-sensory information processing and a corresponding taxonomy of multi-modal feedback will be established, from which guidelines for the design of effective human-machine interfaces for teleoperation systems will be derived.
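The Bayesian sensory-integration account referred to above is commonly formalized as reliability-weighted (maximum-likelihood) cue fusion: each modality delivers a noisy estimate, and the integrated percept weights each cue by its inverse variance. The sketch below is purely illustrative of that standard formalism, not the project's actual model; the function name and the example numbers (two hypothetical direction estimates in degrees, one visual and one haptic) are assumptions.

```python
import numpy as np

def integrate_cues(estimates, variances):
    """Reliability-weighted fusion of independent sensory cues.

    Under the standard maximum-likelihood model of multi-sensory
    integration, cue i contributes a noisy estimate with variance
    sigma_i^2; the optimal combined estimate weights each cue by its
    reliability 1/sigma_i^2, and the fused variance is never larger
    than that of the most reliable single cue.
    """
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    weights = reliabilities / reliabilities.sum()
    fused_estimate = float(np.dot(weights, estimates))
    fused_variance = float(1.0 / reliabilities.sum())
    return fused_estimate, fused_variance

# Hypothetical example: visual cue says 10 deg (var 4), haptic says 16 deg (var 9).
fused, var = integrate_cues([10.0, 16.0], [4.0, 9.0])
```

In this toy case the fused estimate lies closer to the more reliable (visual) cue, and the fused variance falls below both single-cue variances — the signature prediction that experiments on multi-modal direction cues can test against observed localization performance.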
DFG Programme Research Grants
 
 
