Project Details
Gaze-Assisted Scalable Interaction in Pervasive Classrooms
Subject Area
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
General, Cognitive and Mathematical Psychology
Term
from 2020 to 2024
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 425867974
Many predictions about the shift from interacting with dedicated single devices to interacting with pervasive environments have come true. However, we are still far from seamless interaction between users and the pervasive computing environments we already live in. One of the key issues is that human-computer interaction is still very often seen as the interaction between one user and one device, instead of the coherent, collaborative, multi-modal interaction of many users with many personal interaction devices and pervasive computing environments. Recent advances in mobile and pervasive eye tracking now allow the integration of information that can be deduced from specific eye behaviour, such as users' potential intentions, their conscious awareness of individual interactive elements, and their current contextual situation. Given the potential of this technology, a key aim of the project is to design, develop, and assess novel gaze-assisted scalable interaction paradigms for pervasive computing environments. A prominent example of pervasive environments are classroom and learning environments, which offer great potential for collaborative interaction and learning that remains unexploited due to very limited support for cross-device interaction and cross-user collaboration. For this reason, we choose public classroom environments as the reference scenario for the proposed project, since they provide an excellent test-bed for gaze-assisted scalable interaction. We will focus on several challenges inherent in classroom settings, in particular how multiple devices and multiple users in various roles affect the interaction. We plan to build a "live class classroom" that enables us to assess user behaviour in an unconstrained real-life scenario.
We will address these challenges from two perspectives: (1) how can we define specific indicators from eye measures that allow us to assess users' mental states during interaction with several devices and users, and (2) how can we design effective, efficient, and satisfying scalable gaze-based interaction techniques? We will conduct two long-term studies: one at the beginning of the project, to assess unconstrained user behaviour when interacting with currently available interaction devices, and a second one at the end, to evaluate the interaction techniques developed during the project. The findings obtained with different methods will be integrated into one coherent assessment of scalable interaction paradigms, drawing on the behavioural and design science research conducted within the project. This integration will ensure a common understanding of the results obtained with various methods in different disciplines, and of how their advancement within the project can be used for further research.
DFG Programme
Priority Programmes