Co-constructing explanations with emotional alignment between AI-explainer and human explainee (Project A03)

In Project A03, researchers are integrating emotion into the explanation process, addressing questions such as: How do emotional reactions influence the perception of an explanation given by an artificial intelligence? Which decisions should humans make with the help of artificial intelligence in risky situations? How can explanations be aligned with the emotional state of the human user? Researchers from the fields of computer science and economics are investigating how emotions such as happiness and anxiety affect comprehension and decision-making. Their goal is to develop a model that enables intelligent machines to recognize emotions and take them into account during explanatory processes.


Schütze, C., Lammert, O., Richter, B., et al. (2023). Emotional Debiasing Explanations for Decisions in HCI. In: Artificial Intelligence in HCI. Lecture Notes in Computer Science, vol. 14050, pp. 318–336.

Lebedeva, A., Kornowicz, J., Lammert, O., Papenkordt, J. (2023). The Role of Response Time for Algorithm Aversion in Fast and Slow Thinking Tasks. In: Artificial Intelligence in HCI. Lecture Notes in Computer Science, vol. 14050, pp. 131–149. This publication was created in cooperation with Arbeitswelt.Plus.

In this video the project leaders present their view of co-construction.
Research areas

Computer science, economics

Project leaders

Prof. Dr. Kirsten Thommes, Paderborn University

Prof. Dr.-Ing. Britta Wrede, Bielefeld University


Olesja Lammert, Paderborn University

Christian Schütze, Bielefeld University

Associate member

Dr.-Ing. Birte Richter, Bielefeld University

Support Staff

Maryam Alizadeh, Bielefeld University

Vivien Mercedes Brückmann, Paderborn University

Robin Sjard von Collani, Bielefeld University

Eldan Huduti, Paderborn University

Ayla Luong, Bielefeld University

Tetyana Shevchuk, Paderborn University