MAI-XAI: Multimodal, Affective and Interactive eXplainable AI
Multimodal Co-Construction of Explanations with XAI Workshop | Santiago de Compostela | October 19-20, 2024
A key issue in eXplainable Artificial Intelligence (XAI) is how to make the interaction with an explainee more engaging, and how to allow explainees to shape the explanation process more actively, so that explanations are adapted and tailored to their needs.
A central hypothesis of the TRR "Constructing Explainability" is that this interaction should be as "natural" as possible, relying on the multiple modalities that humans also use in their day-to-day communication.
With the goal of designing systems that can engage users in natural explanatory interaction, members of the TRR (Philipp Cimiano, Eyke Hüllermeier) are co-organising a workshop on multimodal, affective and interactive XAI (MAI-XAI) as part of this year's edition of the European Conference on Artificial Intelligence (ECAI) in Santiago de Compostela, Spain.
The workshop is made up of three tracks:
Multimodal XAI:
Multimodal XAI is concerned with building and validating multimodal resources that contribute to the generation and evaluation of effective multimodal explanations. The track also welcomes case studies of real-world applications of XAI, emphasizing their benefits and challenges.
Affective XAI:
Affective XAI addresses the challenges, opportunities and solutions for applying explainable machine learning algorithms in affective computing (also known as artificial emotional intelligence), i.e. machine systems that sense and recognize emotions.
Interactive XAI:
Interactive XAI asks how to place users' understanding, and their ability to operate effectively, at the center of the XAI process, and how to achieve, improve and measure both, as a basis for dynamically and interactively adapting explanations to users' needs and level of understanding.
Concerning the question of how to make XAI systems more co-constructive, a core topic of our TRR, the workshop invites discussion of the following key topics:
• Dialogue-based approaches to XAI
• Use of multiple modalities in XAI systems
• Approaches to dynamically adapt explainability in interaction with a user
• XAI approaches that use a model of the partner to adapt explanations
• Methods to measure and evaluate users' understanding of a model
• Methods to measure and evaluate the ability to use models effectively in downstream tasks
• Interactive methods by which a system and a user can negotiate what is to be explained
• Modelling the social functions and aspects of an explanation
• Methods to identify a user’s information and explainability needs
The workshop is jointly organized by Bielefeld University, University of Bristol, University of Santiago de Compostela, University Jaume I, University of the Basque Country, Ludwig-Maximilian-University Munich, University of Queensland, ETH Zurich, Neapolis University Pafos, Universitatea Petrol-Gaze din Ploiești and the Polytechnic University of Bucharest. Further information can be found on the workshop website.