Project Details

Privacy-Preserving Interaction with On-Body Computers

Subject Area: Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term: since 2023
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 521601028
 
New types of body-worn devices promise new, scalable user interfaces that are more intuitive to use, more direct to access, and, unlike existing handheld devices, compatible with demanding mobile activities. However, existing means of body-based input and output pose serious new risks to user privacy: the large mid-air hand and finger gestures typically used for input are considerably less resilient to observation attacks than established touch input on smartphones. Visual output on the body is, to an even larger extent, inherently observable by third parties. This is particularly problematic because body-worn devices are typically used during mobile activities, which take place in social or semi-social settings and often in public spaces.

The primary goal of this project is to contribute to the scalability of on-body computing to real-world social settings by developing interaction techniques for private information input and output that offer improved resilience to privacy violations. Central to our approach is leveraging the unique interactional properties of the human body: high manual dexterity, high tactile sensitivity, and a large available surface for input and output, paired with the possibility to flexibly shield input and output through variable body posture. These properties can form the basis for a new set of body-based input and output techniques that are scalable and (practically) unobservable. This goal is largely unexplored so far. It is very demanding because of the new and highly varied form factors and scales of on-body devices and the substantially novel forms of multimodal input and output, which are further complicated by the inherent complexity of social settings in terms of interpersonal configurations, the respective proxemics, and the attention of users and bystanders.

To inform and map the interaction design space, we will empirically investigate the privacy of tactile input, visual output, and haptic output at various body locations, depending on body posture and collaborative proxemic configurations. We will then systematically conceptualize and implement body-based input gestures, as well as scalable techniques for multimodal body interaction, that preserve privacy in social settings in light of a generalized threat model. We will ground our findings in attention models that reference a taxonomy of the human body, making them more accessible to formalization. The resulting interactions will be empirically evaluated with users both in realistic scenarios and in the lab, to assess how their constituent properties influence usability, privacy, and scalability; together, these studies will help us understand the internal and external validity of our approach. We expect the results of this project to contribute substantially to laying the foundations of scalable, privacy-preserving body-based interaction.
DFG Programme: Priority Programmes
 
 
