Unraveling students’ interaction around a tangible interface using Multimodal Learning Analytics

Author: Schneider, B. & Blikstein, P.
Year: 2014
Type: Refereed Conference Paper/Poster/Demo (with Proceedings)
Conference/Journal: EDM 2014

Schneider, B., & Blikstein, P. (2014). Unraveling Students’ Interaction Around a Tangible Interface using Multimodal Learning Analytics. In Proceedings of the 7th International Conference on Educational Data Mining, London, UK.


In this paper, we describe techniques that use multimodal learning analytics to analyze data collected around an interactive tangible learning environment. In a previous study [13], we designed and evaluated a Tangible User Interface (TUI) where dyads of students were asked to learn about the human hearing system by reconstructing it. In the current study, we present an analysis of the data collected in the form of logs, both from students’ interaction with the tangible interface and from their gestures, and we describe how we extracted meaningful predictors of students’ learning from those two datasets. First, we show how Natural Language Processing (NLP) techniques can be applied to the tangible interface logs to predict learning. Second, we explore how Kinect™ data can inform “in-situ” interactions around a tabletop (e.g., using clustering algorithms to find prototypical body positions). Finally, we split students into two groups by performing a median split on their learning scores and fed those features to a machine-learning classifier (a Support Vector Machine). We found that we were able to predict students’ learning gains (i.e., being above or below the median split) with very high accuracy. We discuss the implications of these results for analyzing data from rich, multimodal learning environments.
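One way NLP techniques can be brought to bear on interaction logs, as the abstract describes, is to treat each log event as a "word" and each session as a "document". The sketch below illustrates this idea with a bag-of-n-grams model; the event names and the use of TF-IDF are illustrative assumptions, not the paper's actual feature set.

```python
# Hypothetical sketch: applying an NLP-style bag-of-n-grams model to
# tangible-interface logs. Event names below are invented for illustration;
# the paper's real log vocabulary and features are not reproduced here.
from sklearn.feature_extraction.text import TfidfVectorizer

# Each string stands in for one dyad's action log, events separated by spaces.
logs = [
    "place_cochlea rotate_piece place_eardrum test_sound",
    "test_sound place_eardrum place_eardrum rotate_piece",
    "rotate_piece rotate_piece test_sound place_cochlea",
]

# Unigrams and bigrams of log events become the feature space; bigrams
# capture short action sequences (e.g., "rotate_piece test_sound").
vec = TfidfVectorizer(ngram_range=(1, 2), token_pattern=r"\S+")
X = vec.fit_transform(logs)
print(X.shape)  # one row per session, one column per event n-gram
```

The resulting matrix can then be fed to any standard classifier to predict learning outcomes from log behavior.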
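The Kinect-clustering and SVM steps from the abstract can be sketched end to end: cluster skeleton frames into prototypical body positions, describe each student by how much time they spend in each posture, median-split learning scores into binary labels, and train an SVM on the posture features. Everything below (data shapes, number of clusters, kernel choice) is an assumption made for the sketch; the paper's actual parameters are not given here.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# (1) cluster Kinect skeleton frames into prototypical postures,
# (2) median-split learning scores into two groups,
# (3) train an SVM on per-student posture features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in for skeleton data: 40 students x 200 frames x 20
# joint coordinates (the study's real dimensions are not specified here).
frames = rng.normal(size=(40, 200, 20))

# (1) K-means over all frames yields k prototypical body positions.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
posture_ids = kmeans.fit_predict(frames.reshape(-1, 20)).reshape(40, 200)

# Each student's feature vector: fraction of time in each posture.
features = np.stack(
    [np.bincount(row, minlength=5) / 200 for row in posture_ids]
)

# (2) Median split on (synthetic) learning gains -> binary labels.
gains = rng.normal(size=40)
y = (gains > np.median(gains)).astype(int)

# (3) SVM classifier, evaluated with 5-fold cross-validation.
acc = cross_val_score(SVC(kernel="rbf"), features, y, cv=5).mean()
print(round(acc, 2))
```

With real posture features that correlate with learning, this is the shape of pipeline that would produce the above/below-median predictions the abstract reports.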