Paper Summary
Realizing the Potential of Automatically Scored Three-Dimensional Assessment

Sat, April 11, 3:45 to 5:15pm PDT, Los Angeles Convention Center, Floor: Level Two, Room 515B

Abstract

Objectives
Advances in AI-enhanced assessments, such as automated scoring, have the potential to support teachers in interpreting and using assessment information to promote learning. This project aims to develop an automated analysis system within an online platform that analyzes and displays performance results while providing instructional guidance to teachers. We focus on supporting teachers so that advances in AI can effectively bring assessment and instruction together to improve learning outcomes. Specifically, this study examines how teachers use the PASTA platform, including automatically generated student reports (AutoRs) and instructional strategies, to support student learning.

Perspective
Our conceptual framework for AI as a support for classroom-based assessment draws from theories of situated and distributed cognition (among others), which emphasize the interaction between human cognitive processes and external resources (Pea, 2004). In our framework, AI tools provide support in various forms to enhance teachers' capabilities, aiming to transform teachers from AI observers into adopters, collaborators, and innovators (Zhai, 2024). We achieve this by:
- Developing AutoRs for science performance tasks that enable teachers to notice, attend to, and interpret student response information in instructionally useful ways.
- Developing next-step instructional strategies that improve teachers' use of the student response data from the assessments and enhance student learning (He et al., 2024).
- Iteratively studying the effectiveness of the AutoRs and instructional strategies in supporting teachers' decision-making.

Methods
We worked with six middle school science teachers and their students. Each teacher used two to four of the assessment tasks in their classroom. Students responded to explanation tasks involving the properties of substances or chemical reactions, which required them to apply their knowledge. Based on collective student performance, teachers then selected one of the next-step instructional strategies provided by the system to implement. We observed their classes to see how the teachers used the strategies and collected surveys on their experiences with the platform.

Results
Overall, our study revealed that the AutoRs and instructional strategies supported teachers' instructional decision-making, as described below.
AutoRs: Teachers found the AutoRs beneficial for understanding students' difficulties with learning goals. One teacher noted, "We had been reinforcing the ideas of claim, evidence, and reasoning all year. With the PASTA tasks, I noticed my students still struggle with reasoning..."
Holistic group report: Teachers reported that the holistic group report was useful because it showed overall class performance, helping them identify their students' strengths and weaknesses relative to the learning goals.
Instructional strategies: Teachers found the provided strategies to be clear, appropriate for the grade level, and aligned with the Standards. Most teachers utilized hands-on activities or teacher demonstrations to help students visualize the concepts addressed in the tasks.

Scholarly Significance
Advances in AI are changing the boundaries for what assessment can look like in science classrooms. As these new assessment tools become more widely available, they hold promise for empowering teachers to use assessments in innovative and instructionally supportive ways. Our research and development work contributes to the evidence base on how AI can achieve that promise.
