Hand Gestures, Nods, and More - Researchers Document Multimodal Signals in Board Game Explanations

The researchers of subproject A02 "Monitoring the understanding of explanations" have recorded more than 150 hours of audio and video material of explanatory situations and compiled it into a corpus. The corpus, called Mundex, was presented at the first International Symposium on Multimodal Communication in Barcelona.

"With the corpus, we have created an extensive collection of explanatory dialogs. Now we need to linguistically code the individual dialogs and classify the multimodal signals that occurred, such as head nods or hand gestures," says linguist Stefan Lazarov from project A02. "This will allow us to investigate the relationship between these signals and the associated explanation fragments in the next step. With the results of our study, we contribute to a better understanding of comprehension processes in explanations."

To obtain the material, the researchers and student assistants of subproject A02 conducted a video study in which two participants were asked to explain a board game to each other. A total of 87 of these interactions were recorded in the recording studio of the Faculty of Linguistics and Literary Studies at Bielefeld University. In each interaction, one speaker first explained the rules of the board game to the other person, and then the two played it together. Afterwards, both participants looked at the actions and reactions of the person to whom the game had been explained and commented on them.

The explanatory situations were recorded in the studio with cameras from several perspectives in order to visually capture as many nonverbal communication signals as possible, while microphones documented the spoken interaction. The researchers are now producing a semi-automatic transcription of the dialogue segments and annotating gestures and gaze, acoustic information such as pitch, and the discourse structure of individual segments. The results are recorded in the Mundex corpus.
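Purely as an illustration of what such time-aligned annotations could look like, the following Python sketch models a hypothetical annotation record and a lookup of nonverbal signals that overlap an explanation fragment in time. The tier names, fields, and example values are assumptions for demonstration only, not the actual Mundex annotation scheme or tooling.

from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Annotation:
    # One time-aligned label on an annotation tier (hypothetical schema).
    tier: str      # e.g. "speech", "gesture", "gaze", "pitch", "discourse"
    start_ms: int  # segment start in milliseconds
    end_ms: int    # segment end in milliseconds
    value: str     # transcript text, gesture type, gaze target, ...

def overlapping_signals(annotations: Iterable[Annotation],
                        fragment: Annotation,
                        tiers: tuple = ("gesture", "gaze")) -> List[Annotation]:
    # Return nonverbal annotations whose time span overlaps the explanation fragment.
    return [a for a in annotations
            if a.tier in tiers
            and a.start_ms < fragment.end_ms
            and a.end_ms > fragment.start_ms]

# Toy data: a head nod produced while a rule is being explained.
fragment = Annotation("speech", 1000, 4000, "You may only move forward on blue fields.")
signals = [Annotation("gesture", 2500, 3200, "head nod"),
           Annotation("gaze", 5000, 6000, "towards board")]
print(overlapping_signals(signals, fragment))  # only the head nod overlaps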

Mundex stands for "multimodal understanding of explanations". The corpus thus provides data that can be used in the future to model comprehension processes in explanatory situations. The goal is to use these models in the design of explainable artificial intelligence (AI) and thereby improve human-machine communication. At the same time, the corpus contributes to a better understanding of how humans adapt their explanatory strategies in interaction.

Image caption: What influence do gestures have when explaining board game rules? The explanatory situations were recorded on video.
Image caption: Stefan Lazarov, researcher in subproject A02.