3rd TRR 318 Conference: Contextualizing Explanations (ContEx25)

As AI systems are increasingly deployed in high-stakes domains, making them transparent becomes ever more important to ensure meaningful human control and to empower users to contest or override AI-based decisions. Without sufficient transparency, increasingly complex and autonomous AI systems may leave users feeling overwhelmed and out of control, which is legally and ethically unacceptable, especially for high-stakes decisions. For users to feel empowered rather than out of control, explanations need to be relevant, providing sufficient information on the basis of which an output can be contested or challenged.

The XAI community has increasingly noted that no single explanation can fit all needs. Further, recent work has advocated a more participatory approach to XAI in which users are not only involved but can directly shape and guide the explanations given by an AI system.

The 3rd TRR 318 Conference, "Contextualizing Explanations", is an international and interdisciplinary conference focusing on the question of how explanations can be contextualized to increase their relevance and empower users.

Key research questions that we want to explore during the conference include:

1. How do contextual variables influence the effectiveness of explanations?

2. What are the relevant context factors to be taken into account in adapting an explanation to specific domains, users, or situations?

3. How can context be represented algorithmically to support contextual adaptation of XAI explanations?

4. What new architectures or approaches in XAI support the dynamic adaptation of explanations with respect to changing user needs?

5. How can user modelling support a more personalized explanation process?

6. In which ways can the dynamics of context be modelled?

7. What are actual examples or use cases of explanation processes and how does context influence the explanation process?

8. How can the suitability of contextually adapted explanations be studied, validated, and evaluated?

9. Which explanation processes are particularly suitable for which context?

10. Which context-specific outcomes are influenced by explanations?

11. How can XAI empower users across diverse contexts to make informed decisions and effectively interact with AI systems?

12. What constitutes a useful taxonomy for categorizing contexts in which explanations are provided?

13. What are the various contexts in which explanations are provided and utilized?

Call for Papers

The 3rd TRR 318 Conference "Contextualizing Explanations" invites contributions from a wide range of disciplines (computational as well as the human and social sciences) seeking to advance research on how explanations can be contextually adapted.

We invite interested participants to submit a two-page abstract via EasyChair: https://easychair.org/conferences/directory?a=33811429

Abstracts will be peer-reviewed and published in proceedings by Bielefeld University Press.

Important Dates

Deadline for Submissions: March 31st

Reviewing Period: March 31st to April 25th

Notification of Acceptance: April 30th

Conference: 17th and 18th of June, Bielefeld

Invited Speakers

Angelo Cangelosi (University of Manchester)

Virginia Dignum (Umeå University)

Kacper Sokol (ETH Zurich)

Organizing Committee

Prof. Dr. Philipp Cimiano


Prof. Benjamin Paaßen


Prof. Dr.-Ing. Anna-Lisa Vollmer


Please direct general questions to conference@trr318.uni-paderborn.de and media enquiries to communication@trr318.uni-paderborn.de.
