In Project B05, researchers from the field of computer science are exploring non-verbal explanations between humans and machines. A robot is tasked with learning an action, such as a specific movement, by interacting with a human. Misunderstandings can arise during this process because human users often do not know how robots acquire skills: is the robot's direction of gaze important, or do other factors influence the machine's learning?
Researchers on this project are investigating study participants' perceptions of how a robot works and are developing visualizations to improve users' understanding of the robot. In addition, the researchers are analyzing how gender, age, and prior knowledge impact interactions with the robot, and how explanatory strategies change over the course of an interaction. The findings of this project will situate the concept of explainability in a social context.