Published: Jan. 23, 2023

The foundational question for Strand 2 is: What advances in theories, interaction paradigms, and frameworks are needed to orchestrate effective student and teacher interactions with AI partners? To address this, Strand 2 has narrowed its focus to three main themes: (1) Enacting Framework and Measurement Methods of Student-AI Teaming, (2) Collaborative Learning (non-verbal and verbal communication; peer scaffolding), and (3) Iterative Design and Evaluation of the AI Partner Interface via Human-Centered Design and HCI-Focused Empirical Studies. These themes will help our team science experts in Strand 2 better understand how students, AI, and teachers can collaborate effectively in both classroom and remote learning contexts.

Measuring Collaborative Problem Solving

Strand 2’s Enacting Framework and Measurement Methods theme focuses on four needs: identifying the basis of collaborative problem solving skills in social, affective, and cognitive processes; identifying features of successful collaboration; identifying how to promote equitable and trusted interactions in team problem solving; and, lastly, determining how to measure all of these. The underlying emphasis of this work includes the data-driven development of automatic measures of conversational influence, trust, and equitable influence, along with complementary theory development focused on the theoretical fundamentals of collaborative problem solving: how communication is coded for problem solving skills, how students themselves perceive collaborative problem solving, and the dynamics of collaboration, in which skills are expressed over time and across multiple student interactions. Team members have been working on ways to tie dynamic speech flow measures based on recurrence analysis and mutual information to human- and automatically coded collaborative problem solving skills and facets. Strand 2 is also organizing an iSAT measurement workshop hosted at Arizona State University early next year, in which multiple iSAT and ASU researchers and students will have the opportunity to analyze iSAT data using their own methods, come together to share their findings, and work as teams to address existing iSAT research questions as well as questions that emerge from the workshop.
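To make the mutual-information side of these speech-flow measures concrete, here is a minimal sketch that estimates the mutual information between two aligned sequences of discrete codes (e.g., speaker turns and human-coded collaborative problem solving facets). The function and the example labels are hypothetical illustrations, not iSAT's actual pipeline.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Estimate mutual information (in bits) between two aligned
    sequences of discrete codes, using empirical frequencies."""
    assert len(xs) == len(ys), "sequences must be aligned turn-by-turn"
    n = len(xs)
    px = Counter(xs)            # marginal counts for first sequence
    py = Counter(ys)            # marginal counts for second sequence
    pxy = Counter(zip(xs, ys))  # joint counts
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint / (p_x * p_y) simplifies to c * n / (px[x] * py[y])
        mi += p_joint * log2(c * n / (px[x] * py[y]))
    return mi

# Hypothetical coded transcript: speaker turns vs. CPS facet codes
speakers = ["A", "B", "A", "C", "B", "A", "C", "B"]
facets = ["share", "share", "negotiate", "share",
          "negotiate", "share", "share", "negotiate"]
print(mutual_information(speakers, facets))
```

Higher values indicate that knowing who is speaking carries more information about which problem solving facet is being expressed; identical sequences recover the full entropy, while independent sequences yield zero.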

Non-verbal Collaboration

Related to the theme of Collaborative Learning, a significant goal for Strand 2 this quarter has been to improve understanding of non-verbal aspects of group collaboration, such as eye gaze and gesture, to better understand and support students’ collaborative engagement. Within this larger goal, Strand 2 team members aim to (1) identify types of non-verbal communication in groups, (2) generate coding schemes and rubrics for non-verbal communication, and (3) annotate videos based on the coding rubrics. To achieve this, team members generated a coding scheme that identifies various aspects of non-verbal behavior, including eye gaze (e.g., joint attention, looking at tools/computer), gestures and body language (e.g., pointing, leaning, manipulating objects, nodding), and emotion (e.g., smile, frown). They then applied this coding scheme to an initial set of twelve 5-minute segments of historical video data collected at the University of Wisconsin. Next, they refined the coding scheme and trained two undergraduate students to annotate videos. Following this, the team started annotating iSAT classroom videos and found that the coding scheme works across different sets of data and contexts, indicating that it is robust in identifying non-verbal communication during collaborative learning in groups. Strand 2 has watched an initial set of 15 iSAT classroom videos and annotated two segments. The team has also made several recommendations for capturing videos in future classroom implementations based on this analysis. In the coming months, Strand 2 plans to code more iSAT classroom and lab videos, manually annotate the videos to coordinate with the automated annotations, and discuss ways to describe the meanings of gestures derived from automated annotations.
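A coding scheme like the one described above can be represented as a small validated data structure, which makes annotations machine-checkable and easy to tally. The category names, code labels, and field names below are assumptions for illustration, not the team's actual scheme.

```python
from dataclasses import dataclass
from collections import Counter

# Hypothetical categories and codes modeled on the scheme described above.
CODING_SCHEME = {
    "eye_gaze": {"joint_attention", "looking_at_tools"},
    "gesture": {"pointing", "leaning", "manipulating_objects", "nodding"},
    "emotion": {"smile", "frown"},
}

@dataclass
class Annotation:
    start_sec: float  # segment-relative start time
    end_sec: float    # segment-relative end time
    student: str      # anonymized student ID
    category: str     # one of CODING_SCHEME's keys
    code: str         # a code within that category

    def __post_init__(self):
        # Reject codes that are not part of the agreed scheme, so two
        # annotators cannot silently drift apart in their labels.
        if self.code not in CODING_SCHEME.get(self.category, set()):
            raise ValueError(f"{self.code!r} is not a valid {self.category!r} code")

def code_frequencies(annotations):
    """Tally how often each (category, code) pair appears in a segment."""
    return Counter((a.category, a.code) for a in annotations)

# Hypothetical annotations for one 5-minute segment
segment = [
    Annotation(3.0, 6.5, "S1", "eye_gaze", "joint_attention"),
    Annotation(4.2, 5.0, "S2", "gesture", "pointing"),
    Annotation(5.1, 5.4, "S1", "emotion", "smile"),
    Annotation(8.0, 11.0, "S2", "eye_gaze", "joint_attention"),
]
print(code_frequencies(segment)[("eye_gaze", "joint_attention")])  # → 2
```

Frequencies computed this way also give a straightforward basis for comparing two annotators' codes on the same segment when checking agreement.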

iSAT Lab Data Collection

One of the major goals of Strand 2’s Iterative Design and Evaluation of the AI Partner Interface theme has been collecting data on collaboration to better understand how such data can be used to model collaborative behaviors and, correspondingly, how to implement these models as part of our future AI Partner. The data is being collected by the iSAT Lab, which ran an initial study this fall consisting of 30 groups in total (15 groups of 2 participants and 15 groups of 3) engaging in five collaborative tasks: the Wason Card task, Weights task, Sphero Group Programming task, MakeCode Debugging task, and Board Game task. The team collected approximately 2.5 hours of data for each group in multiple forms, including Kinect video, regular video and audio, screen recording, eye tracking with Tobii glasses, and state and trait surveys covering affective, social, and cognitive states as well as personality and leadership traits.
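A study with this many synchronized modalities per group typically benefits from a per-session manifest that records what was captured and where. The sketch below shows one minimal way to structure such a manifest; the file names and field names are assumptions for illustration, not the iSAT Lab's actual layout.

```python
from dataclasses import dataclass, field

# Hypothetical manifest for one lab group's session, listing the five
# collaborative tasks and the data modalities described above.
@dataclass
class GroupSession:
    group_id: str
    n_participants: int  # 2 or 3 in the initial study
    tasks: list = field(default_factory=lambda: [
        "Wason Card", "Weights", "Sphero Group Programming",
        "MakeCode Debugging", "Board Game",
    ])
    modalities: dict = field(default_factory=lambda: {
        "kinect_video": "kinect.mkv",          # hypothetical file names
        "video_audio": "room_cam.mp4",
        "screen_recording": "screen.mp4",
        "eye_tracking": "tobii_glasses.tsv",
        "surveys": "state_trait_surveys.csv",
    })

session = GroupSession(group_id="G07", n_participants=3)
print(len(session.tasks))  # → 5
```

Keeping task and modality lists in one declarative record makes it easy to verify, before analysis, that every group's session is complete.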

Photo: iSAT Lab participants examine collaborative problem solving using an eye-tracker.

The next steps involve collaborating with ASU to dive deeply into analyzing the data. Strand 2 is in the process of organizing files that will include all transcript data, survey and outcome data, and eye movement data. These files will form the critical groundwork for a data investigation at ASU, where students will apply different approaches to analyzing the data, guided by research questions of interest to them; the files will also be available to the larger iSAT team to support its research. The team will also begin implementing new lab experiments focused on explicitly testing initial versions of AI partners whose underlying models are based on this early iSAT lab data.