Projects

A total of 23 project leads, supported by some 40 researchers at Bielefeld University and Paderborn University, are investigating the co-construction of explanations. Their fields range from linguistics, psychology, and media studies to sociology, economics, philosophy, and computer science.

The areas of research included in TRR 318 are divided into three categories: (A) “Explaining”, (B) “Social practice”, and (C) “Representing and computing explanations”. These three areas are in turn subdivided into interdisciplinary subprojects.

Project INF provides an overarching research structure; Project Ö facilitates public relations and outreach; Project Z addresses administrative, organizational, and financial matters; and the RTG Project provides a framework for educating doctoral and post-doctoral researchers.

Overview

Area A - Explaining

Abbreviation Name
A01 Adaptive explanation generation
A02 Monitoring the understanding of explanations
A03 Co-constructing explanations with emotional alignment between AI-explainer and human explainee
A04 Integrating the technical model into the partner model in explanations of digital artifacts
A05 Contextualized and online parametrization of attention in human-robot explanatory dialog
A06 Co-constructing social signs of understanding to adapt monitoring to diversity


Area B - Social practice

Abbreviation Name
B01 A dialog-based approach to explaining machine learning models
B03 Exploring users, roles, and explanations in real-world contexts
B05 Co-constructing explainability with an interactively learning robot
B06 Ethics and normativity of explainable artificial intelligence


Area C - Representing and computing explanations

Abbreviation Name
C01 Healthy distrust in explanations
C02 Interactive learning of explainable, situation-adapted decision models
C03 Interpretable machine learning: Explaining Change
C04 Metaphors as an explanation tool
C05 Creating explanations in collaborative human-machine knowledge exploration
C06 Technically enabled explanation of speaker traits


Intersecting projects

Abbreviation Name
Z Central administrative project
RTG Research training group
Ö Questions about explainable technology
INF Toward a framework for assessing explanation quality


Associated projects

Title – Description
Development of symmetrical mental models – Independent research group at Paderborn University
Human-Centric Explainable AI – Independent research group at Bielefeld University
