
SimGest -- Simulation of Scalable Gestures for Human-Computer Interaction

Subject Area: Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term: since 2023
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 521585176
 
Gesture control is a type of Human-Computer Interaction in which no pointer is moved to a virtual object; instead, the user's movement itself constitutes the interaction. Surface gestures are nowadays a common interaction modality on mobile phones and large interactive whiteboards, but with the advances in Virtual and Augmented Reality and the increasing availability of Head-Mounted Displays, spatial gestures are becoming more and more important. Uncommon usage scenarios in particular introduce additional constraints into gesture design and require an interaction concept that takes contextual factors and other aspects into account.

Software developers, however, face a variety of challenges when developing gesture-based applications. Gestures must be recognizable, i.e., the user's movements must be matched with known gestures and the application must react accordingly. Gestures must be robust, i.e., variations in performance between different users must not affect the application's reaction. Gestures must also fit the device, whereby different factors such as the posture in which the device is used play a role. Most notably, gestures must fit the user, i.e., the ergonomics, memorability, and semantic ambiguity of the gestures, as well as the user's motor skills, must be taken into account during development, as they heavily influence the individual User Experience. Finally, gestures must scale and adapt to the user's context and environment, i.e., the current situation the user is in, e.g., the social setting, location, or current task.

These challenges cannot be met with suitable development tools alone: above all, they require intensive testing of the application in general and of the interaction modalities in particular. Testing the interaction must furthermore include generating inputs and examining the output, i.e., performing real gestures and checking whether the application reacts as expected. Since manual testing is time-consuming and costly, test automation is desirable, but it requires gesture simulation capabilities that are not yet available to the necessary extent. In particular, since gestures are fuzzy, gesture simulation must generate gestures that are distorted according to the variances resulting from different user groups and the aspects discussed above. And as the state of the art progresses towards more sophisticated types of gesture-based interaction, e.g., with smart textiles or other appliances, testing capabilities must scale with the gestures.

In this project, we aim to develop the necessary foundational work through empirical studies and to find means for simulating gesture performances from different user groups.
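
To make the simulation idea concrete, the following is a minimal, self-contained sketch (Python); it is not the project's tooling, and all names (simulate_performance, recognize, resample, normalize) and parameters (jitter, rotation, scale variance) are illustrative assumptions. It distorts a 2D gesture template with a small rotation, non-uniform scaling, and per-point motor jitter to mimic variance between users, then checks whether a simple template matcher in the spirit of $1-style recognizers still classifies the simulated performances.

    import math
    import random

    def path_length(pts):
        return sum(math.dist(pts[i - 1], pts[i]) for i in range(1, len(pts)))

    def resample(pts, n=32):
        """Resample a trajectory to n points spaced evenly along the path."""
        interval = path_length(pts) / (n - 1)
        pts = list(pts)
        out = [pts[0]]
        acc = 0.0
        i = 1
        while i < len(pts):
            d = math.dist(pts[i - 1], pts[i])
            if d > 0 and acc + d >= interval:
                t = (interval - acc) / d
                q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                     pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
                out.append(q)
                pts.insert(i, q)  # continue measuring from the new point
                acc = 0.0
            else:
                acc += d
            i += 1
        while len(out) < n:  # guard against floating-point shortfall
            out.append(pts[-1])
        return out[:n]

    def normalize(pts):
        """Translate the centroid to the origin and scale to a unit box."""
        cx = sum(p[0] for p in pts) / len(pts)
        cy = sum(p[1] for p in pts) / len(pts)
        pts = [(x - cx, y - cy) for x, y in pts]
        w = (max(p[0] for p in pts) - min(p[0] for p in pts)) or 1.0
        h = (max(p[1] for p in pts) - min(p[1] for p in pts)) or 1.0
        s = max(w, h)
        return [(x / s, y / s) for x, y in pts]

    def simulate_performance(template, jitter=0.03, rot_deg=8.0, scale_var=0.15):
        """Distort a template as a hypothetical user might perform it:
        small rotation, non-uniform scaling, and per-point motor jitter."""
        a = math.radians(random.uniform(-rot_deg, rot_deg))
        sx = 1.0 + random.uniform(-scale_var, scale_var)
        sy = 1.0 + random.uniform(-scale_var, scale_var)
        out = []
        for x, y in template:
            rx = x * math.cos(a) - y * math.sin(a)
            ry = x * math.sin(a) + y * math.cos(a)
            out.append((rx * sx + random.gauss(0, jitter),
                        ry * sy + random.gauss(0, jitter)))
        return out

    def recognize(candidate, templates):
        """Return the template name with the smallest mean point distance."""
        c = normalize(resample(candidate))
        def score(t):
            tt = normalize(resample(t))
            return sum(math.dist(p, q) for p, q in zip(c, tt)) / len(c)
        return min(templates, key=lambda name: score(templates[name]))

    # Usage: does the recognizer survive 100 simulated noisy performances?
    templates = {
        "line":     [(0.0, 0.0), (1.0, 0.0)],
        "triangle": [(0.0, 0.0), (1.0, 0.0), (0.5, 0.9), (0.0, 0.0)],
    }
    hits = sum(recognize(simulate_performance(templates["triangle"]),
                         templates) == "triangle" for _ in range(100))
    print(f"recognized {hits}/100 simulated performances")

A real gesture simulator would of course have to model the user-group variances discussed above (motor skills, ergonomics, context) rather than generic noise, and cover spatial as well as surface gestures; the sketch only illustrates the test-automation loop of generating distorted inputs and examining the recognizer's output.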
DFG Programme: Priority Programmes