New research group on explainable artificial intelligence and its impact on human trust

Bielefeld University is funding a new independent research group on explainable artificial intelligence and its impact on over-reliance in AI-assisted decision-making for three years. The group "Human-Centric Explainable AI" is led by Dr. David Johnson, a computer scientist at CITEC at Bielefeld University, and complements the work of TRR 318. With the "Independent Research Groups", Paderborn and Bielefeld Universities support outstanding young scientists on their career paths.

The group is directly linked to the TRR's focus on the co-construction of explanations: it develops theory on how people process explanations from AI systems and identifies which types of explanations work best at reducing over-reliance on AI recommendations. Like the TRR, the group takes an interdisciplinary approach, working at the intersection of computer science (XAI), psychology (mental health assessment), and human-machine interaction (human-centred XAI design).

Innovations in AI are leading the way in improving decision-making in areas such as mental health assessment, Johnson explains: "However, AI solutions are not perfect and can be biased in their decision-making. Therefore, practitioners using AI-assisted decision-making systems should know why an AI has recommended a certain diagnosis so that they can better judge the recommendation. This allows the practitioner to make an informed decision rather than just blindly trusting the AI."

In recent years, a field of research called "explainable AI" (XAI) has emerged, which attempts to make AI decisions more transparent for end users by providing explanations for these decisions. Methods for generating explanations have been the major focus of previous research. Dr. Johnson is therefore now focusing on how effective these explanations actually are in real-life situations, for example in decision-making: "The research in my group will aim to address this by putting human knowledge, needs, and understanding first in the design of XAI-based systems."

The group's goals are to improve the design of XAI-based tools and interfaces by including users in the design process, to conduct extensive evaluations to better understand how humans interact with explanations, and to determine which methods are most effective at improving AI-supported decision-making. This knowledge will be used to design new interactive AI-supported decision-making systems for real-world problems. A collaboration with the "Multimodal Behavior Processing" working group led by Professor Dr. Hanna Drimalla at CITEC in Bielefeld is also planned.

Dr. David Johnson is a research associate and an associate member of project A06 "Co-constructing social signs of understanding to adapt monitoring to diversity".

Photo: Dr. David Johnson is the leader of the new research group.