Paper Summary
Supporting Students’ Argumentative Writing via Formative Automated Writing and Revision Assessment

Sun, April 14, 9:35 to 11:05am, Pennsylvania Convention Center, Floor: Level 100, Room 119B

Abstract

Purpose
eRevise supports students' argument writing in the upper elementary and middle school grades by providing automated formative feedback on drafts of their essays, targeted toward improving their use of text evidence. This paper describes the theory underlying key design decisions in our system as well as the evidence we have generated for our validity investigation. Finally, we discuss the design of the next generation of our system, eRevise+RF.

Theoretical Framework
In developing eRevise, we focused on argument writing in response to text because of its potential to inspire student thinking and reasoning. The ability to marshal evidence in support of claims in writing is critical to college readiness, yet younger writers often lack familiarity with the discursive features associated with argumentation, such as identifying convincing evidence and explaining how it connects to a claim (Ferretti & Graham, 2019; O'Hallaron, 2014; Schleppegrell, 2004; Snow & Uccelli, 2009). As importantly, however, writing can be a powerful tool for advancing thinking and reasoning. This is as true when students are supported to generate new insights and/or extend their knowledge within a discipline (Scardamalia & Bereiter, 1987; Graham et al., 2020) as it is when they generalize their understanding of knowledge-building – i.e., become metacognitive about how to engage in a dialogic process for knowledge construction (e.g., Cavdar & Doe, 2016).

Methods
We report on a validity investigation for the use of eRevise as a formative assessment (authors, 2022). We then discuss the evidence and rationale behind the design decisions guiding eRevise+RF, as well as the validity investigation we plan to conduct for it.

Data
Our validity investigation took place in 8 public parishes (i.e., districts) in Louisiana that are representative of the state's demographics. Sixteen English language arts teachers participated in the study. They were selected for their comfort with basic technology and their access to a class set of computers for the online administration of eRevise. They administered eRevise to 266 fifth and sixth grade students who completed all data collection (i.e., submitted both a first draft and a revised draft of the essay and completed the post-eRevise survey items).

Results
The object of focus for the validity argument was the automated feedback generated by our AWE system. Our resultant validity investigation made the case that our feedback was understood by students, that it aligned with teachers' instructional goals, and that students demonstrated evidence of learning in ways aligned with the feedback we provided.

Significance
While our validity investigation concluded that the evidence across 8 research questions suggests strong potential for eRevise as a formative assessment, our work with eRevise has also highlighted students' difficulties revising in response to feedback (authors, 2020) – a critical yet difficult skill to master, one that itself requires explicit feedback and scaffolding. Our enhanced system – called eRevise+RF, where RF stands for 'revision feedback' – represents our effort toward this vision. In addition to formative feedback on evidence use, eRevise+RF also provides formative assessment of the quality of students' first revision, a construct central to writing as a tool for knowledge production.
