Project Details
PrivateEyeXR: Preserving Eye Privacy in Extended Reality, from Data Preprocessing to Model Development
Applicant
Professor Dr. Enkelejda Kasneci
Subject Area
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term
since 2022
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 491966293
Extended reality (XR) technologies increasingly rely on eye tracking to enable immersive interaction, yet eye data is highly sensitive: it can reveal biometric, cognitive, and behavioral traits, and therefore demands robust privacy protection. Existing approaches, such as anonymization and basic differential privacy, fail to address the challenges specific to XR, including real-time data streams, temporal correlations in gaze behavior, and the need for scalable cross-device processing.

This project develops a comprehensive framework for preserving eye privacy across the XR data lifecycle. Building on foundational work in neural style transfer and federated learning, we aim to develop real-time feature decomposition techniques for dynamic eye data, enabling privacy-aware manipulation of iris and gaze patterns without compromising utility for tasks such as gaze estimation. We further adapt federated and split learning algorithms to XR’s cross-device constraints, optimizing in particular for data heterogeneity, partial client participation, and bandwidth limitations. Advanced privacy mechanisms, such as temporally aware differential privacy and the Pufferfish privacy framework, will be integrated to mitigate risks from sequential inference attacks.

The framework will be validated through user studies, real-time XR deployments, and open-source tools, bridging the gap between privacy preservation and functional utility in eye tracking. By establishing scalable, ethical standards for eye data protection, the project will directly benefit secure XR applications in healthcare, education, and authentication, while advancing foundational research in privacy-preserving machine learning for multimodal sensory systems.
DFG Programme
Research Grants
