Worsley, M. & Blikstein, P. (2010). Towards the development of learning analytics: student speech as an automatic and natural form of assessment. Paper presented at the Annual Meeting of the American Educational Research Association (AERA).
While many of the nation’s educators and leaders are calling for students to develop 21st-century competencies through student-centered, hands-on learning, most school systems continue to cling to traditional forms of instruction. This reliance on traditional instruction is not without merit: assessment within open-ended learning environments remains difficult and, oftentimes, seemingly unsatisfying. The problem is further complicated by the heavy emphasis placed on students demonstrating their knowledge through standardized tests. As a way of addressing this discontinuity between practice and theory, we have worked to develop Learning Analytics: a set of multi-modal sensory inputs that can be used to predict, understand, and quantify student learning. Central to the efficacy of Learning Analytics is the belief that educators will more readily adhere to learning recommendations when they are given the proper tools; in this case, tools for more accurately assessing student knowledge in open-ended learning tasks. Accordingly, this study presents findings related to one of the Learning Analytics modalities: speech. By leveraging the tools of text and speech analysis, we are able to identify domain-independent markers of expertise. Among the most prominent markers are user certainty, the ability to describe things efficiently, and a disinclination to use unnecessary descriptors or qualifiers. While many of these behaviors are what one would expect of an expert, some are also observed among novices. To explain this, we draw on learning theories that can reconcile these seemingly odd findings, and expound on how these markers can be useful for tracking student learning over the course of an intervention or classroom experience.