Lecture on missing interactivity in Explainable AI by Kary Främling

On December 14, Prof. Dr. Kary Främling from Umeå University in Sweden will be a guest at TRR 318. The professor of Data Science, whose work emphasizes data analysis and machine learning, conducts in-depth research on explainable AI. He is also a member of the TRR's advisory board and in that role supports projects A03 and C02. During the Activity Afternoon, Främling will give a lecture on the lack of interactivity in explainable AI and will also accompany the subsequent discussion. The event will be held online via Zoom and is open to everyone interested.

Details of the lecture:

"Why is current Explainable AI not interactive and what can we do about it?"

by Kary Främling

Abstract:
Current state-of-the-art Explainable AI (XAI) methods tend to produce a static "explanation" that shows how influential different features were for the outcome of an AI model, without offering the user any means of interaction. One reason is that those methods cannot provide explanations in more than one way. The Contextual Importance and Utility (CIU) method differs from them by natively providing counterfactual "what-if" explanations in addition to the typical feature-influence explanations. More importantly, CIU's so-called intermediate concepts make it possible to provide explanations at different levels of abstraction and with different vocabularies, depending on the user. This flexibility does not automatically make explanations interactive, but it does open up real possibilities for creating interaction and adapting explanations to the needs of individual users.
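The core idea behind Contextual Importance (CI) and Contextual Utility (CU) can be illustrated with a toy sketch: sweep one feature over its range while holding the others fixed, and compare the resulting output range with the model's overall output range. This is a simplified illustration only, not the API of Främling's actual CIU implementations (the `ciu` R package or `py-ciu` for Python); the toy model and its known output range of [0, 1] are assumptions made for the example.

```python
import numpy as np

# Hypothetical toy model for illustration: two inputs in [0, 1],
# output also in [0, 1] (0.7 * 1 + 0.3 * 1 = 1.0 at most).
def model(x):
    return 0.7 * x[0] + 0.3 * x[1] ** 2

def ciu_sketch(model, x, feature, lo=0.0, hi=1.0, n=101):
    """Estimate CI and CU for one feature of instance x by sweeping that
    feature over [lo, hi] while keeping the other features fixed."""
    grid = np.linspace(lo, hi, n)
    outs = []
    for v in grid:
        x_mod = np.array(x, dtype=float)
        x_mod[feature] = v          # counterfactual "what-if" variation
        outs.append(model(x_mod))
    cmin, cmax = min(outs), max(outs)
    absmin, absmax = 0.0, 1.0       # assumed known output range of the toy model
    ci = (cmax - cmin) / (absmax - absmin)   # importance: share of output range
    cu = (model(np.array(x, dtype=float)) - cmin) / (cmax - cmin)  # utility
    return ci, cu

ci0, cu0 = ciu_sketch(model, [0.5, 0.5], feature=0)  # CI = 0.7, CU = 0.5
ci1, cu1 = ciu_sketch(model, [0.5, 0.5], feature=1)  # CI = 0.3, CU = 0.25
```

Because CI and CU are computed per feature (or per group of features), the same machinery can answer "what if this input changed?" and report importance at different levels of abstraction, which is what the intermediate concepts mentioned in the abstract build on.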

Date: 14.12.2022, 2:30 pm
Zoom: Link to the lecture
