Published: Sept. 22, 2022

Socio-collaborative learning is key to students’ development of deep conceptual understandings, but it can be challenging for educators to monitor and orchestrate the collaboration occurring within each group in the classroom. iSAT aims to expand the scope of what can be sensed in the classroom to provide actionable information to both teachers and students. An institute-wide team designed an early version of an interactive display of collaborative learning metrics based on automated analysis of students’ speech in the classroom. We implemented a technical architecture designed for real-time data collection and analysis of small group collaborations, including audio, video, and activity logs. The system allows for flexible incorporation of different AI-based collaborative analysis components as well as displays for different stakeholders (e.g., researchers, students, teachers).
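To make the idea of swappable analysis components concrete, here is a minimal sketch of what such a plug-in interface could look like. The class and function names are illustrative assumptions, not identifiers from the iSAT codebase:

```python
from abc import ABC, abstractmethod

# Hypothetical plug-in interface; names are illustrative, not from iSAT.
class AnalysisComponent(ABC):
    """One swappable stage that turns a data segment into metrics."""

    @abstractmethod
    def process(self, segment: dict) -> dict:
        """Consume one segment (audio chunk, transcript turn, or
        activity-log entry) and return derived metrics."""

class WordCountComponent(AnalysisComponent):
    """Toy example: count the words in a transcribed segment."""

    def process(self, segment: dict) -> dict:
        return {"word_count": len(segment.get("transcript", "").split())}

def run_components(components: list[AnalysisComponent], segment: dict) -> list[dict]:
    """Fan one incoming segment out to every registered component."""
    return [component.process(segment) for component in components]

# Example: register a component and process one transcript segment.
results = run_components([WordCountComponent()],
                         {"transcript": "the ice melts faster in warm water"})
print(results)  # [{'word_count': 7}]
```

A shared interface like this is what lets new AI-based analyses, or new stakeholder displays downstream of them, be added without reworking the rest of the pipeline.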

iSAT Technical Infrastructure

Version one of the iSAT Technical Infrastructure

In this version, a team’s audio/video data is captured through a custom-designed web-based application and streamed to a cloud-based secure data repository in short 10-second chunks. The audio is then processed through an AI pipeline that automatically characterizes and displays the collaborative processes exhibited by the teams. First, the audio passes through the Classroom Activity Perception Engine, which includes Google’s automatic speech recognition engine as well as customized speaker diarization models. The resulting diarized transcript is processed by the Intelligent Inference Engine, which computes speech, conversation, content, and collaboration metrics; these are presented through the Visualization Engine in the form of an interactive display.
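A simplified sketch of that capture loop is below. The function names and the silent-audio placeholder are assumptions standing in for the web app and the secure repository, not actual iSAT code:

```python
import threading
import time

CHUNK_SECONDS = 10  # matches the 10-second chunks described above

def record_chunk(duration_s: int) -> bytes:
    """Stand-in for the web app's microphone capture."""
    time.sleep(duration_s)                     # pretend to record
    return b"\x00" * (16000 * 2 * duration_s)  # silent 16 kHz 16-bit mono

def upload_chunk(team_id: str, chunk: bytes) -> None:
    """Stand-in for the push to the cloud-based secure data repository,
    where the Perception, Inference, and Visualization engines pick it up."""
    print(f"team {team_id}: uploaded {len(chunk)} bytes")

def stream_team_audio(team_id: str, stop: threading.Event) -> None:
    """Capture one team's audio in short chunks, one upload per chunk."""
    while not stop.is_set():
        upload_chunk(team_id, record_chunk(CHUNK_SECONDS))

# Example: stream one team's audio for roughly two chunks, then stop.
stop = threading.Event()
worker = threading.Thread(target=stream_team_audio, args=("A", stop))
worker.start()
time.sleep(2 * CHUNK_SECONDS + 1)
stop.set()
worker.join()
```

Short fixed-length chunks keep latency bounded: each chunk can move through transcription, diarization, and metric computation while the next one is still being recorded.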

Visualizations in the Main Display tab are organized into five categories: Acoustic (e.g., signal-to-noise ratio), Conversation (e.g., verbosity, or the number of words spoken by a team over time), Content (e.g., what students are saying), Task/Topic (e.g., the degree of task- and topic-related talk in the student conversation), and Collaboration (e.g., classifiers for the degree to which students are engaging in shared knowledge construction, negotiating and coordinating, and maintaining a positive team dynamic). The initial versions of the iSAT technical infrastructure and corresponding display were designed for noisy classrooms in which multiple teams of middle school students learn collaboratively. Currently, the team is designing interactive displays for teachers and students through participatory design processes and with a lens of Responsible Innovation, which attends to ethical questions and responsiveness to the needs of teachers and students. A demonstration video of this early v1 AI Partner won the Best Interactive Event Award at the Artificial Intelligence in Education Conference.
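As one concrete example of a Conversation metric, here is a minimal sketch of verbosity computed from a diarized transcript. The (speaker, start time, text) turn format is an assumed simplification of whatever the Intelligent Inference Engine actually consumes:

```python
from collections import Counter

def verbosity(turns, window_s=60):
    """Count words spoken per speaker per time window, from diarized
    turns given as (speaker, start_time_seconds, text) tuples."""
    counts = Counter()
    for speaker, start_s, text in turns:
        counts[(speaker, int(start_s // window_s))] += len(text.split())
    return counts

# Example: three turns spanning two one-minute windows.
turns = [
    ("S1", 3.0, "I think the energy transfers to the water"),
    ("S2", 8.5, "right so the temperature should go up"),
    ("S1", 65.0, "let's record that"),
]
print(verbosity(turns))
# Counter({('S1', 0): 8, ('S2', 0): 7, ('S1', 1): 3})
```

Windowed counts like these are what allow the display to show how a team's talk evolves over the course of an activity rather than a single session-level total.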

Intelligent Inference Engine Display

The Intelligent Inference Engine computes speech, conversation, content, and collaboration metrics, all displayed here.

The team aims to conduct user tests with the student- and teacher-facing interactive displays in the fall. This is an important step toward iSAT’s goal of using AI to make classrooms more effective, engaging, and equitable learning environments by helping teachers and students harness the power of collaborative learning.