Objectives
In this paper, we discuss our collaborative work with teachers to create a design-based physics unit on roller coasters that enabled students to write scientific explanations and receive automated feedback from PyrEval (Gao et al., 2018), an NLP technology. We integrated teachers’ feedback throughout the development of both the automated assessment and the curriculum, which were designed to support students in understanding and writing explanations of important physics ideas and relationships. After each implementation, we solicited teachers’ feedback about what worked and what could be improved.
Methods
Initially, the teachers shared their thoughts on which science ideas were important for students to learn and include in their writing. In our meetings, we discussed examples of students’ writing and teachers’ current practices to help them explore the importance of having students write explanations in science and their own role in supporting this process. We gave an overview of PyrEval and how it works, relating it to the teachers’ current assessment practices. We also discussed the feedback that students would receive on their written explanations and, in consultation with the teachers, designed that feedback as text statements.
Thereafter, in a follow-up meeting, the teachers provided feedback on how the automated assessment worked in their classrooms, what problems they encountered, and the difficulties that students had in interpreting the feedback. Based on their input, we changed the text-based feedback into a list of ideas that students either mentioned or missed in their essays. More importantly, in this meeting we also helped teachers understand how factors such as sentence length, punctuation, and vague explanations of science ideas could affect PyrEval’s accuracy (Puntambekar et al., 2024). We also worked with teachers to build participatory structures and routines into the classroom to better support students’ revisions based on the automated assessments. These included peer review, group reflection on the feedback, and teacher-led whole-class discussions to help students understand how to address the automated feedback in their revised essays and write better scientific explanations.
Results
Throughout our interactions with teachers, we emphasized that we could not accomplish the goals of the project without their input, and we fostered a collaborative environment. Our meetings were framed as co-design meetings (Penuel et al., 2007) and were highly teacher-centered. We focused on uncovering teachers’ thinking about teaching science and their classroom practices. We then built upon and extended the ways they were already supporting students’ science learning by discussing how automated assessments could help their students and their teaching.
Further, we helped teachers understand the benefits of the automated assessments as well as their limitations. We wanted teachers to understand the kind of feedback students might receive and why, and to recognize that the NLP system may be inaccurate at times. This understanding enabled the teachers to support students’ use of the feedback to make targeted, substantive, and meaningful revisions.
Significance
Our work with teachers has enabled us to implement AI in classrooms in a way that works synergistically with the support that teachers and peers provide, so that human and automated feedback complement each other.