Project Details
SPACE-EYE: Spatial and Architectural Evidence from Movement-aware Eye-tracking
Applicant
Professor Dr. Jakub Krukar
Subject Area
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term
since 2025
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 570734475
The proposed project SPACE-EYE investigates how movement through space shapes visual attention in architectural contexts. Current models of human-building interaction either analyze visual attention independently of movement or simplify movement by assuming a fixed field of view aligned with the walking direction. This artificial separation yields insights too trivial to inform design and therefore often ignored by designers. The project integrates theories of visual attention (e.g., visual saliency, the perception-action loop, and embodied attention) with movement-based data to develop a computational framework that jointly models gaze behavior and locomotion. The central research question is: what is the impact of movement through space on the visual attention measures used in human-building interaction modelling?
To answer this question, we propose four work packages.
WP1 develops methods to measure whether and how movement influences attention allocation. This includes identifying relevant movement and gaze metrics, creating a synchronization pipeline for eye-tracking and motion data (an illustrative sketch follows after this summary), and validating the pipeline on existing datasets.
WP2 identifies when movement impacts attention by systematically manipulating approach trajectories and target salience (visual and structural) in virtual reality scenarios. Sixty participants will experience varied architectural settings (e.g., an airport, a shopping mall), and their gaze behavior will be analyzed with Bayesian models to determine the explanatory power of competing theories.
WP3 tests how movement influences attention in real-world, high-impact scenarios using mobile eye-tracking and motion capture in a reconfigurable lab. This enables validation of the virtual reality findings, assessment of inter-individual differences, and the use of unsupervised machine learning (e.g., UMAP; see the second sketch below) to detect patterns in high-dimensional movement data.
WP4 disseminates open datasets and software (following the FAIR and FAIR4RS principles), organizes a cross-disciplinary workshop, engages the public via a museum event, and cross-validates the methods with an external research group.
The project’s impact lies in bridging two distinct domains: theories of movement and theories of visual attention. By modelling their interaction, we offer a new methodological foundation for predicting gaze behavior in dynamic, real-world environments. This is especially relevant for human-building interaction modelling, spatial HCI, geoinformatics, and architectural design. We address a known limitation of architectural simulation algorithms, namely their lack of embodiment, by providing models grounded in actual bodily movement. Open science principles ensure the reusability and extensibility of our tools. The project will strengthen the field of architectural cognition and position the SPARC lab at a unique interdisciplinary intersection.
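As a rough, hypothetical illustration of the kind of synchronization step planned in WP1 (not the project's actual pipeline), the following Python sketch aligns a gaze stream with a motion-capture stream by nearest timestamp. The file names, column names, and shared-clock assumption are all placeholders.

# Minimal sketch: aligning eye-tracking and motion-capture streams by timestamp.
# Assumptions (placeholders, not taken from the project description): both
# devices export CSV files with a shared-clock timestamp column "t" in seconds,
# and the motion-capture stream is sampled at least as fast as the gaze stream.
import pandas as pd

def synchronize(gaze_csv: str, mocap_csv: str, tolerance_s: float = 0.02) -> pd.DataFrame:
    # merge_asof requires both inputs to be sorted by the key column.
    gaze = pd.read_csv(gaze_csv).sort_values("t")
    mocap = pd.read_csv(mocap_csv).sort_values("t")
    # Attach to every gaze sample the nearest motion-capture sample in time,
    # dropping pairs whose timestamps differ by more than the tolerance.
    merged = pd.merge_asof(
        gaze, mocap, on="t", direction="nearest",
        tolerance=tolerance_s, suffixes=("_gaze", "_mocap"),
    )
    return merged.dropna()

if __name__ == "__main__":
    synced = synchronize("gaze.csv", "mocap.csv")
    print(synced.head())

A nearest-neighbour merge with a tolerance is only one plausible strategy; resampling both streams to a common rate would be another.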
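Similarly, the second sketch illustrates the kind of unsupervised dimensionality reduction mentioned for WP3, using the umap-learn package on a placeholder feature matrix; the feature set and its size are invented for the example and do not reflect the project's actual data.

# Minimal sketch: projecting high-dimensional movement/gaze features with UMAP.
# The feature matrix below is random placeholder data; in practice each row
# would hold per-trial features such as walking speed, path curvature,
# fixation duration, or saccade amplitude.
import numpy as np
import umap  # pip install umap-learn
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
features = rng.normal(size=(60, 12))  # 60 trials x 12 features (placeholder)

# Standardize so that no single feature's units dominate the distance metric.
scaled = StandardScaler().fit_transform(features)

# Embed into two dimensions for visual inspection of clusters or gradients.
embedding = umap.UMAP(n_neighbors=15, min_dist=0.1, random_state=0).fit_transform(scaled)
print(embedding.shape)  # (60, 2)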
DFG Programme
Research Grants
