OpenGesture: a low-cost, easy-to-author application framework for collaborative, gesture-, and speech-based learning applications

Author: Worsley, M. & Blikstein, P.
Year: 2012
Project: Multimodal Learning Analytics
Type: Conference Presentation (with paper)
Conference/Journal: AERA 2012
Citation:

Worsley, M., & Blikstein, P. (2012). OpenGesture: a low-cost, easy-to-author
application framework for collaborative, gesture-, and speech-based
learning applications. Paper presented at the Annual Meeting of the American
Educational Research Association (AERA 2012).

DOI:
10.1145/1999030.1999075

Abstract

In this paper, we present an application framework for developing gesture- and speech-based applications for collaborative learning environments. More specifically, we combine the affordances of natural user interfaces with educational theories of embodied cognition to build a framework that enables education researchers and practitioners to create rich multimodal learning experiences. The paper also reports a user study exploring how students interacted in a multi-user, collaborative space. Our initial findings indicate that such applications are feasible for well-defined learning tasks, but may require machine-learning-based training to succeed in contexts where the vocabulary is more diverse.