Project Details
Illusionary Surface Interfaces
Subject Area
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term
from 2020 to 2024
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 425869442
If we consider Weiser's vision that future computers will "weave themselves into the fabric of everyday life until they are indistinguishable from it", the question of interface design and how it relates to pervasive computers throws up some interesting challenges. Traditional computers typically refer (through visual affordances) to both the perceived and actual properties of the interface, suggesting not only fundamental functionalities but also determining how humans might use the system. Such rich information visualization may, however, not suit the way we want pervasive computers to look and feel.

We aim to exploit multisensory illusions to extend the range of interface properties that can be displayed, using only the surfaces of everyday objects as interfaces. In a manner similar to the "rubber hand illusion", in which people can be induced to perceive a physical touch based purely on what they see, we will support visual and haptic feedback induced by augmented vision. Instead of changing the objects' physicality, we will visually augment them using "smart glasses" and projectors, while at the same time augmenting them haptically by inducing multisensory illusions.

Illusionary feedback when touching a surface offers great potential for enhancing pervasive interaction design through the addition of materiality. Using this approach, we can induce the perception of smooth and stiff surfaces as soft, uneven, flexible, or deformable. This opens the space for added interface properties that could compensate for the lack of visual interface affordances that smart objects will often have.

Through user studies, we will explore how such a multisensory-illusion interaction paradigm can be used to enhance everyday objects. We will first extract parameters that describe multisensory surface perception, such as the degree of pressure applied, the haptic sensation of fabric, and the distortion induced when surfaces composed of soft, stretchable, and deformable materials are touched. We will then mimic the analog multimodal experience using multisensory illusion and explore how to adjust the illusion-affecting parameters to "program" the multisensory illusion. Finally, we will design, implement, and evaluate illusionary surface interaction for Buxton's entire taxonomy of input devices.

By combining complementary expertise in amplified cognition, ubiquitous computing, and interaction design, we will create a scalable, unified, and experimentally validated model for useful tangibles and touch-based sensory-illusion interfaces. We will provide a technological framework for their efficient implementation, enabling effortless deployment in smart personal spaces and control rooms. This model will be grounded in the theory of multisensory integration and will constitute a theory transfer from cognitive science towards a new interaction paradigm allowing for rich and scalable pervasive human-computer interaction.
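To make the idea of "programming" illusion-affecting parameters concrete, the following is a minimal, hypothetical sketch: all names (`IllusionParams`, `visual_indentation`) and the simple linear pressure-to-indentation mapping are illustrative assumptions for a pseudo-haptic compliance illusion, not the project's actual model.

```python
from dataclasses import dataclass

@dataclass
class IllusionParams:
    """Hypothetical illusion-affecting parameters for one surface."""
    compliance: float     # perceived softness, 0.0 (rigid) .. 1.0 (very soft)
    texture_gain: float   # strength of the rendered micro-texture overlay
    max_indent_px: float  # deepest visual indentation shown on contact

def visual_indentation(params: IllusionParams, pressure: float) -> float:
    """Pseudo-haptic mapping: a stronger press on a 'softer' surface is
    rendered as a deeper visual dent, inducing illusory compliance."""
    pressure = min(max(pressure, 0.0), 1.0)  # normalized fingertip pressure
    return params.compliance * pressure * params.max_indent_px

# "Programming" the illusion then amounts to tuning parameters per surface:
foam = IllusionParams(compliance=0.8, texture_gain=0.3, max_indent_px=12.0)
wood = IllusionParams(compliance=0.05, texture_gain=0.6, max_indent_px=12.0)
```

In this sketch, the same fingertip pressure produces a much deeper rendered indentation on the "foam" surface than on the "wood" surface, so the visually augmented object feels softer even though its physical surface is unchanged.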
DFG Programme
Priority Programmes