An Interdisciplinary Take on Explainable Artificial Intelligence

Scientists from the “Constructing Explainability” Collaborative Research Center (CRC/TRR 318), a joint research initiative of Bielefeld University and Paderborn University, are presenting new approaches and findings in a special issue of the journal “KI – Künstliche Intelligenz” [“AI – Artificial Intelligence”], published by Springer. Professor Dr.-Ing. Britta Wrede, a computer scientist at TRR 318, edited the issue together with fellow computer scientist Professor Dr. Ute Schmid. The special issue focuses on the field of “explainable artificial intelligence” (XAI).

“Research on explainable artificial intelligence deals with approaches that empower humans to understand – and possibly even direct – machine learning and other artificial intelligence systems,” says Wrede, Professor of Medical Assistive Systems at Bielefeld University, who also heads subprojects A03, A05, and Ö at TRR 318. “More recent work in this field has focused on the process of explanation – for example, how explanations can be created by humans together with AI. At TRR 318, we are investigating this co-construction of explanations. In this special issue, we feature these approaches and bring other interdisciplinary perspectives to bear on the topic of XAI.”

Wrede and other researchers from TRR 318 contributed the following pieces to the special issue:

What is still missing for XAI to work?

In their overview article “What is missing in XAI so far?”, Schmid and Wrede bring together interdisciplinary perspectives on explaining and understanding. The article also identifies unresolved questions in the field of XAI, such as how dialogue-based explanations can be adapted to a user's specific information needs and what role interaction plays in modeling explanations.

AI Explanations in Real Time

The article “Agnostic Explanation of Model Change based on Feature Importance” was written by Maximilian Muschalik, Fabian Fumagalli, Professor Dr. Barbara Hammer, and Professor Dr. Eyke Hüllermeier. The computer scientists address explainable AI systems in the context of online learning in dynamic environments. “Previously, XAI methods were mainly used in static environments, where models do not change over time and can be extensively analyzed and explained,” says Maximilian Muschalik. “In dynamic environments, however, data streams are constantly changing, and the systems have to adapt to this. Explanations must therefore be updated, too.” In the article, the researchers from Ludwig Maximilian University of Munich and Bielefeld University present a model-agnostic approach in which, after a model changes, only the significant differences are explained rather than the entire model anew.
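The core idea lends itself to a brief illustration. The following Python sketch is a minimal illustration of the principle, not the authors' method: it computes model-agnostic (permutation-based) feature-importance scores before and after a model update on simulated stream data and reports only the features whose importance changed noticeably. The dataset, learner, and threshold are all illustrative assumptions.

```python
# Minimal sketch of the general idea (not the authors' implementation):
# model-agnostic explanation of *model change* via feature importance.
# Rather than re-explaining the whole model after an update, we compare
# importance scores before and after and report only the significant
# differences. Data, model, and threshold are illustrative choices.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

def importances(model, X, y):
    """Model-agnostic feature importance via permutation."""
    result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
    return result.importances_mean

# Two simulated batches of a data stream; the second has a different
# relationship between features and labels (concept drift).
X_old, y_old = make_classification(n_samples=500, n_features=6, n_informative=3, random_state=0)
X_new, y_new = make_classification(n_samples=500, n_features=6, n_informative=3, random_state=7)

model = RandomForestClassifier(random_state=0).fit(X_old, y_old)
imp_before = importances(model, X_old, y_old)

# The model adapts to the new data (a true online learner would update in place).
model.fit(X_new, y_new)
imp_after = importances(model, X_new, y_new)

# Explain only the change: features whose importance shifted beyond a threshold.
THRESHOLD = 0.05  # illustrative; a real system might use a statistical test
for i, delta in enumerate(imp_after - imp_before):
    if abs(delta) > THRESHOLD:
        print(f"feature {i}: importance changed by {delta:+.3f}")
```

In a real streaming setting, the batch refit would be replaced by an incremental model update, and the fixed threshold by a statistically grounded change criterion.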

Who does what during an explanation?

Linguists Josephine B. Fisher and Vivien Lohmer published a research report entitled “Exploring monological and dialogical phases in naturally occurring explanations” under the direction of Prof. Dr. Katharina J. Rohlfing (Paderborn University) and Prof. Dr. Friederike Kern (Bielefeld University). The report was created in cooperation with the medical doctors Prof. Dr. med. Winfried Barthlen and Dr. med. Sebastian Gaus of the Bethel Clinic. Together, the team investigated the structure of natural explanations in real-life medical contexts, focusing on the different phases of an explanation and the roles participants adopt during the process of explaining. “In identifying the monological and dialogical phases of explanations, it became apparent that there is a pattern in the respective roles. For example, those who are having something explained to them take an active role by initiating dialogical phases,” explains Fisher, the lead author of the report. “This has implications for AI systems, because they should also be able to facilitate similar phases and actions.”

Interview with TRR 318 Spokespersons

The special issue also includes an interview with Prof. Dr. Katharina Rohlfing, the spokesperson of TRR 318, and Prof. Dr. Philipp Cimiano, who serves as deputy spokesperson. The two discuss the research activities at TRR 318, including the goals of the Transregional Collaborative Research Center run by Paderborn University and Bielefeld University and the academic disciplines represented in it. They also present new scientific concepts on explanations developed by the team at TRR 318.

The scientific journal “KI – Künstliche Intelligenz” [“AI – Artificial Intelligence”] is published by Springer and is the official journal of the “Gesellschaft für Informatik e.V.” [German Informatics Society]. The articles published in this journal are available free of charge.

Prof. Dr.-Ing. Britta Wrede, project leader of projects A03, A05 and Ö and co-editor of the special issue