Project Details

TRR 318: Constructing explainability

Subject Area Computer Science, Systems and Electrical Engineering; Humanities; Social and Behavioural Sciences
Term since 2021
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 438445824
In our digitized society, algorithmic approaches (such as machine learning) are rapidly increasing in complexity, making it difficult for citizens to understand the assistance they offer and to accept the decisions they suggest. In response to this societal challenge, research has begun to push forward the idea that algorithms should be explainable or even able to explain their own output. This has led to the lively development of systems that provide explanations in an intelligent way (explainable AI, or XAI). Although such AI systems have the potential to provide explanations to humans, their interaction is severely limited because they build on the assumption that explanations are ‘delivered,’ that is, that information is merely provided. Such a paradigm risks generating explanations that are not tailored to the receivers’ understanding, let alone their informational needs or the given context.

With the interdisciplinary Transregional Collaborative Research Center (TRR), we are challenging this reductionist view of explanations and proposing a new paradigm of co-constructing an explanation that will contribute to novel ways of interacting with machines: within this paradigm, humans play an active part in explanation practices by co-shaping both the goal and the process of explaining. Our approach promotes humans’ active and mindful participation in sociotechnical settings with AI technologies, thus increasing their informational sovereignty. Because such a paradigm change requires an interdisciplinary approach, our TRR brings together linguists, psychologists, media studies researchers, sociologists, economists, and computer scientists who are firmly convinced that investigating the nature and inherent mechanisms of explainability and explanation requires putting human understanding at the center of attention and considering it a product of a contextualized, multimodal co-construction.

In our approach, we will model the explanation process as an interaction that unfolds over time and advances the construction of the explanandum. Consequently, we will investigate the explanandum as a dynamic product of the interaction. This will extend current research in (computer) science and offer new answers to the aforementioned societal challenge by contributing to the development of (i) a multidisciplinary understanding of the mechanisms involved in the process of explaining, tightly coupled with the process of understanding and with the contextual factors that modulate both; (ii) computational models and complex AI systems that focus efficiently on what kind of explanation a person needs in a given context; and (iii) a theory of explanations as social practices and communicative acts that takes the roles and expectations of participants into account. Our research will lay the foundations for explainable and understandable AI systems that empower citizens to be active and mindful participants in a digital society.
DFG Programme CRC/Transregios

Applicant Institution Universität Paderborn
Co-Applicant Institution Universität Bielefeld
Participating University Ludwig-Maximilians-Universität München
