Co-constructing explanations with emotional alignment between AI-explainer and human explainee (Project A03)
In Project A03, researchers are integrating emotion into the explanation process, addressing questions such as: How do emotional reactions influence the perception of an explanation given by an artificial intelligence? Which decisions should humans make with the support of artificial intelligence in risky situations? How can explanations be aligned with the emotional state of the human user? Researchers from the fields of computer science and economics are investigating how emotions such as happiness and anxiety affect comprehension and decision-making. Their goal is to develop a model for intelligent machines that recognizes emotions and takes them into account during the explanation process.
Schütze, C., Lammert, O., Richter, B., et al. (2023). Emotional Debiasing Explanations for Decisions in HCI. In: Artificial Intelligence in HCI. Lecture Notes in Computer Science, vol. 14050, pp. 318-336. https://doi.org/10.1007/978-3-031-35891-3_20
Lebedeva, A., Kornowicz, J., Lammert, O., Papenkordt, J. (2023). The Role of Response Time for Algorithm Aversion in Fast and Slow Thinking Tasks. In: Artificial Intelligence in HCI. Lecture Notes in Computer Science, vol. 14050, pp. 131-149. https://doi.org/10.1007/978-3-031-35891-3_9. This publication was created in cooperation with Arbeitswelt.Plus (https://arbeitswelt.plus/).
In the accompanying video, the project leaders present their view of co-construction.