Project C05: Creating explanations in collaborative human–machine knowledge exploration
When it comes to decision-making in medicine, a great deal of information must be considered and evaluated: both prior medical knowledge and the patient's previous test results and health history. Based on this information, doctors make a diagnosis and decide which treatment the patient should receive. In Project C05, researchers in computer science at Bielefeld University are developing an intelligent system that assists medical professionals in evaluating all of the available treatment options and the potential consequences of these therapies. For this, the system does not just answer questions; it also actively asks questions and gives recommendations. Through this interaction, medical professionals arrive at an informed treatment decision together with an explanation of why the chosen therapy is the most appropriate compared to the other possible options. A minimal illustrative sketch of such a mixed-initiative loop is given below.
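The interaction described above can be pictured as a simple mixed-initiative loop: the system asks for missing patient information, ranks the remaining treatment options, and justifies its top recommendation by contrasting it with the next-best alternative. The following Python sketch is purely illustrative; the class names, the additive scoring function, and the example findings are hypothetical assumptions for this page and are not taken from the project's actual system (such as the ASCODI prototype listed among the publications).

```python
from dataclasses import dataclass, field


@dataclass
class TreatmentOption:
    name: str
    # Hypothetical per-finding weights: how strongly each recorded finding
    # supports (positive) or speaks against (negative) this option.
    weights: dict = field(default_factory=dict)

    def score(self, findings: dict) -> float:
        """Sum the weights of all findings marked as present in the record."""
        return sum(w for f, w in self.weights.items() if findings.get(f))


class DecisionSupportSession:
    """Illustrative mixed-initiative loop: ask for missing findings, then recommend."""

    def __init__(self, options, relevant_findings):
        self.options = options
        self.relevant_findings = relevant_findings
        self.findings = {}  # patient information collected during the dialogue

    def next_question(self):
        """Return the next finding the system should actively ask about, or None."""
        for f in self.relevant_findings:
            if f not in self.findings:
                return f
        return None

    def record_answer(self, finding, present):
        self.findings[finding] = present

    def recommend(self):
        """Rank options and explain the top choice by contrast with the runner-up."""
        ranked = sorted(self.options, key=lambda o: o.score(self.findings), reverse=True)
        best, runner_up = ranked[0], ranked[1]
        explanation = (
            f"{best.name} is recommended (score {best.score(self.findings):.1f}) "
            f"over {runner_up.name} (score {runner_up.score(self.findings):.1f}) "
            f"given the recorded findings."
        )
        return best.name, explanation


if __name__ == "__main__":
    # Hypothetical options and findings, chosen only to make the loop runnable.
    options = [
        TreatmentOption("Therapy A", {"lesion_on_MRI": 3.0, "seizure_free_on_medication": -2.0}),
        TreatmentOption("Therapy B", {"lesion_on_MRI": -1.0, "seizure_free_on_medication": 2.0}),
    ]
    session = DecisionSupportSession(options, ["lesion_on_MRI", "seizure_free_on_medication"])
    while (finding := session.next_question()) is not None:
        print(f"System asks: is '{finding}' present?")
        session.record_answer(finding, present=True)  # stand-in for the clinician's answer
    print(session.recommend())
```

The point of the sketch is only the shape of the interaction: the system takes the initiative in gathering information and pairs its recommendation with a contrastive justification, rather than returning an answer in a single step.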
Research areas: Computer science
Support Staff
Lukas Kachel, Bielefeld University
Rakhi A S Nair, Bielefeld University
Marcel Nieveler, Bielefeld University
Daniel Prib, Bielefeld University
Publications
Formalizing cognitive biases in medical diagnostic reasoning
D. Battefeld, S. Kopp, in: Proceedings of the 8th Workshop on Formal and Cognitive Reasoning, 2022.
A Prototype of an Interactive Clinical Decision Support System with Counterfactual Explanations
F. Liedeker, P. Cimiano, 2023.
Dynamic Feature Selection in AI-based Diagnostic Decision Support for Epilepsy
F. Liedeker, P. Cimiano, 2023.
Revealing the Dynamics of Medical Diagnostic Reasoning as Step-by-Step Cognitive Process Trajectories
D. Battefeld, S. Mues, T. Wehner, P. House, C. Kellinghaus, J. Wellmer, S. Kopp, in: Proceedings of the 46th Annual Conference of the Cognitive Science Society, 2024.
A User Study Evaluating Argumentative Explanations in Diagnostic Decision Support
F. Liedeker, O. Sanchez-Graillet, M. Seidler, C. Brandt, J. Wellmer, P. Cimiano, n.d.
An Empirical Investigation of Users' Assessment of XAI Explanations: Identifying the Sweet-Spot of Explanation Complexity
F. Liedeker, C. Düsing, M. Nieveler, P. Cimiano, 2024.
ASCODI: An XAI-based interactive reasoning support system for justifiable medical diagnosing
D. Battefeld, F. Liedeker, P. Cimiano, S. Kopp, in: Proceedings of the 1st Workshop on Multimodal, Affective and Interactive EXplainable AI (MAI-XAI), 2024.