Project Details
Ethics and normativity of explainable AI (B06)
Subject Area
Practical Philosophy
Term
since 2023
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 438445824
Research on explaining and explainability needs ethical reflection, because explanations can be used to manipulate users or to create acceptance for a technology that is ethically or legally unacceptable. Moreover, designing explainability to meet ethical demands (e.g., justifiability, accountability, autonomy) does not necessarily align with meeting users’ interests. The ethical reflection proposed in our project encompasses three intertwined lines of investigation: First, we shall systematically classify the different purposes, needs, and requirements of explanations. Second, our project will reflect ethically on the technological development within TRR 318. Third, we shall combine these two lines of research to pursue both a theoretical and a practical goal: On the one hand, we want to extend TRR 318’s model of explaining as a social practice into an ethical framework of explaining AI. On the other hand, we shall apply this framework to concrete projects within TRR 318 to (a) explicate how current design choices reflect ethical considerations and users’ demands by following the methodological steps of value-sensitive design (VSD). From these insights, we shall then (b) formulate concrete design recommendations to inform further development within TRR 318. In the long run, B06 will also enhance the TRR’s consideration of social contexts by identifying issues that cannot be fixed technically but need to be addressed at a social or legal level.
DFG Programme
CRC/Transregios
Subproject of
TRR 318: Constructing Explainability
Applicant Institution
Universität Paderborn
Project Heads
Professor Dr. Suzana Alpsancar; Professor Dr. Tobias Matzner