Developing Explanations Together
How can humans make sense of decisions made by machines? What do algorithmic approaches tell us? How can artificial intelligence (AI) become understandable?
Technical explanations frequently require prior knowledge of how AI works and are difficult to follow. In the Transregional Collaborative Research Centre “Constructing Explainability” (TRR 318), researchers are exploring how to integrate users into explanatory processes.
The interdisciplinary research team is approaching this topic from two angles: first, by understanding the mechanisms, principles, and social practices behind explanations, and second, by considering how these can be designed into artificial intelligence systems. The goal of the project is to make explanatory processes more intelligible and to create easily understandable assistive systems.
A total of 22 project leads, supported by some 40 researchers at Bielefeld University and Paderborn University, are investigating the co-construction of explanations. They come from fields ranging from linguistics, psychology, and media studies to sociology, economics, and computer science.