Project A04: Integrating the technical model into the partner model in explanations of digital artifacts

Project A04 investigates the different perspectives on the contents of explanations, i.e., what an explanation is about, and how these perspectives may change in the course of an interactive explanatory dialogue. An explanation of a technical artifact (which might be a hammer as well as a digital game) can encompass two different perspectives: on the one hand, explainers can describe the (im)material properties of the artifact, i.e., its architecture; on the other hand, explainers may provide reasons why something is the way it is and what goals can be pursued with it, thus describing its relevance. With regard to the project's objective, we expect the perspectives chosen by the explainer, and their combination during the explanatory process, to depend on the respective interlocutors, their expertise, and their interests. Researchers from psychology, linguistics, and the didactics of computer science analyze dialogical, interactive explanations, first of a board game and later of digital artifacts. The explanations are studied in relation to the interlocutors' conceptions of the explanation's subject and how these conceptions may change over the course of an interaction. Based on these findings, the goal is to derive a dynamic conceptualisation of explaining digital artifacts.


Research areas: Psychology, Linguistics, Computer science education

Project leaders

Prof. Dr. Heike M. Buhl, Paderborn University

Prof. Dr. Friederike Kern, Bielefeld University

Prof. Dr. Carsten Schulte, Paderborn University


Research staff

Vivien Lohmer, Bielefeld University

Michael Schaffer, Paderborn University

Lutz Terfloth, Paderborn University

Support staff

Lars Hoferichter, Paderborn University

Rieke Roxanne Mülfarth, Paderborn University

Claire Roberts, Paderborn University

Svenja Schulte, Paderborn University

Alina Yudakov, Paderborn University


Publications

Fisher, J.B., Lohmer, V., Kern, F., et al. (2022). Exploring Monological and Dialogical Phases in Naturally Occurring Explanations. KI - Künstliche Intelligenz.

Terfloth, L., Schaffer, M., Buhl, H.M., & Schulte, C. (2023). Adding Why to What? Analyses of an Everyday Explanation.