Projects

A total of 22 project leads, supported by some 40 researchers at Bielefeld University and Paderborn University, are investigating the co-construction of explanations. Their fields range from linguistics, psychology, and media studies to sociology, economics, philosophy, and computer science.

The areas of research included in TRR 318 are divided into three categories: (A) “Explaining”, (B) “Social practice”, and (C) “Representing and computing explanations”. These areas are in turn subdivided into interdisciplinary subprojects.

Project INF provides an overarching research structure; Project Ö facilitates public relations and outreach; Project Z addresses administrative, organizational, and financial matters; and the RTG Project provides a framework for educating doctoral and post-doctoral researchers.

Overview

Area A - Explaining

A01 - Adaptive explanation generation
A02 - Monitoring the understanding of explanations
A03 - Co-constructing explanations with emotional alignment between AI explainer and human explainee
A04 - Integrating the technical model into the partner model in explanations of digital artifacts
A05 - Contextualized and online parametrization of attention in human-robot explanatory dialog
A06 - Co-constructing social signs of understanding to adapt monitoring to diversity


Area B - Social practice

B01 - A dialog-based approach to explaining machine learning models
B03 - Exploring users, roles, and explanations in real-world contexts
B05 - Co-constructing explainability with an interactively learning robot
B06 - Ethics and normativity of explainable artificial intelligence


Area C - Representing and computing explanations

C01 - Healthy distrust in explanations
C02 - Interactive learning of explainable, situation-adapted decision models
C03 - Interpretable machine learning: explaining change
C04 - Metaphors as an explanation tool
C05 - Creating explanations in collaborative human-machine knowledge exploration
C06 - Technically enabled explanation of speaker traits


Intersecting projects

Z - Central administrative project
RTG - Research training group
Ö - Questions about explainable technology
INF - Toward a framework for assessing explanation quality


Associated projects

Development of symmetrical mental models - Independent research group at Paderborn University
Human-Centric Explainable AI - Independent research group at Bielefeld University
