Using advanced sensing and artificial intelligence technologies, we are investigating new ways to assess project-based activities, examining students’ speech, gestures, sketches, and artifacts in order to better characterize their learning over extended periods of time.
Politicians, educators, business leaders, and researchers are unanimous in stating that we need to redesign schools to teach “21st century skills”: creativity, innovation, critical thinking, problem solving, communication, and collaboration. None of these skills is easily measured with current assessment techniques, such as multiple-choice tests or even portfolios. As a result, our schools are caught between the push to teach new skills and the lack of reliable ways to assess those skills or to provide students with formative feedback. One difficulty is that current assessment instruments are based on products (an exam, a project, a portfolio) rather than processes (the actual cognitive and intellectual development that occurs while performing a learning activity), owing to the intrinsic difficulty of capturing detailed process data for large numbers of students. However, new sensing and data-mining technologies could make it possible to capture and analyze massive amounts of process data from classroom activities. We are conducting research on the use of biosensing, signal and image processing, text mining, and machine learning to explore multimodal, process-based student assessments.
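As a purely illustrative sketch of what process-based analysis can mean in practice, the snippet below aggregates timestamped multimodal events (speech segments, detected gestures) into simple per-student process features. The event format, feature choices, and function name are all hypothetical, not the project's actual pipeline; real systems would draw on far richer signal-processing and machine-learning stages.

```python
# Hypothetical sketch: fusing multimodal process data into per-student
# features. Event schema and feature definitions are illustrative only.
from collections import defaultdict

def summarize_events(events, session_seconds):
    """Aggregate timestamped (student, modality) events into simple
    process features: fraction of session time spent speaking and
    gesture rate per minute."""
    speech = defaultdict(float)   # total seconds of speech per student
    gestures = defaultdict(int)   # gesture count per student
    for ev in events:
        if ev["modality"] == "speech":
            speech[ev["student"]] += ev["duration"]
        elif ev["modality"] == "gesture":
            gestures[ev["student"]] += 1
    students = set(speech) | set(gestures)
    return {
        s: {
            "speech_fraction": speech[s] / session_seconds,
            "gestures_per_min": gestures[s] / (session_seconds / 60.0),
        }
        for s in students
    }

# Example: a ten-minute session with two students.
events = [
    {"student": "A", "modality": "speech", "duration": 30.0},
    {"student": "A", "modality": "gesture"},
    {"student": "B", "modality": "speech", "duration": 60.0},
]
features = summarize_events(events, session_seconds=600)
```

Even toy features like these shift the unit of analysis from the final artifact to the unfolding activity, which is the core of the process-based approach described above.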
Thus far, we have been able to show that multimodal analysis techniques provide a powerful way to study complex learning environments.
In our ongoing research we are developing open-source tools for capturing and analyzing multimodal data in hands-on learning environments. We are also continuing to conduct experimental studies that can be used to examine the efficacy of different learning strategies and learning environments.