U.S. student math proficiency tends to decline from elementary to middle school, suggesting that many students are not learning concepts effectively (U.S. Department of Education, 2022).
Making functional use of errors is a key part of the mathematics learning process (Reusser, 2000; Tulis et al., 2016). Understanding more about error patterns in response to intervention and across grades is essential to developing effective interventions (Bailey et al., 2020). The present study examined the association between baseline problem-solving performance and change in performance following implementation of a supplemental problem-solving curriculum, “Our Math World,” with 3rd–7th graders (N = 88). Specifically, we asked:
1. How do patterns of problem-solving accuracy and the prevalence of error types change across grades?
2. How predictive are pre-test error patterns (conceptually poor vs. sound) for students’ response to intervention (i.e., higher accuracy and/or fewer conceptually-poor errors)?
We examined pre- and post-test single-step problem-solving data (6 problems each; see Figure 1). Responses were analyzed for accuracy and error type (coding adapted from Kingsdorf and Krawec [2014]). Errors were categorized as reflecting conceptually poor or conceptually sound understanding of the required process (see Table 1). Error analyses treated each category as a proportion of a student’s total errors, computed only for students who committed errors.
As expected, overall accuracy increased across grades (p < .001) and from pre- to post-test (p = .035; see Table 2). However, error patterns differed across grades. Seventh graders demonstrated the highest accuracy at pretest; however, they also demonstrated the highest proportion of conceptually-poor errors at pretest, higher than every other grade (p = .021, .024, and .055 for comparisons with 3rd, 4th, and 5th grade, respectively). Post-test error patterns did not differ by grade. Next, we conducted regression analyses (see Tables 3 and 4) examining pre-test predictors of post-test accuracy and of conceptually-poor errors (relative to conceptually-sound errors). Pre-test accuracy was a significant predictor of post-test accuracy (β = .622, p < .001), controlling for grade. Furthermore, when pre-test conceptually-poor errors were added to the same model, they significantly predicted post-test accuracy above and beyond pre-test accuracy (β = -.245, p = .005). This model explained 57.4% of the variance in post-test accuracy, with conceptually-poor errors accounting for 11.5% of that variance beyond pre-test accuracy. In the model predicting post-test conceptually-poor errors, pre-test accuracy was a significant predictor (β = -.282, p = .039), controlling for grade; pre-test conceptually-poor errors, however, were not (p = .73).
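The hierarchical regression logic described above (a baseline model with grade and pre-test accuracy, then adding pre-test conceptually-poor errors and comparing R²) can be sketched as follows. This is a minimal illustration with simulated data; the variable names, sample generation, and effect sizes are assumptions for demonstration only, not the study’s data or analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 88  # matches the study's sample size, but the data below are simulated

grade = rng.integers(3, 8, n).astype(float)          # grades 3-7
pre_acc = rng.uniform(0, 1, n)                        # pre-test accuracy
pre_poor = rng.uniform(0, 1, n)                       # proportion of conceptually-poor errors
# Simulated outcome loosely echoing the reported directions of effect
post_acc = 0.1 * grade / 7 + 0.6 * pre_acc - 0.25 * pre_poor + rng.normal(0, 0.1, n)

def r_squared(predictors, y):
    """OLS R-squared for an intercept plus the given predictor columns."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

r2_base = r_squared([grade, pre_acc], post_acc)            # Model 1
r2_full = r_squared([grade, pre_acc, pre_poor], post_acc)  # Model 2
print(f"R2 base = {r2_base:.3f}, full = {r2_full:.3f}, "
      f"delta R2 = {r2_full - r2_base:.3f}")
```

The increment in R² from the base to the full model (ΔR²) is the quantity reported in the abstract as the 11.5% of post-test accuracy variance explained by conceptually-poor errors beyond pre-test accuracy.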
Findings suggest that additional attention to the nature of students’ errors is needed to support struggling students. Students who committed conceptually-poor errors (relative to conceptually-sound ones) were less likely to improve in accuracy post-intervention, suggesting they require additional remediation. Additionally, 7th graders who committed errors demonstrated a higher rate of conceptually-poor errors, suggesting that students who struggle with math word-problem solving in earlier grades may continue to fall farther behind. Future work with larger samples is needed to increase power for detecting grade effects. Finally, teacher training to identify conceptually-poor errors in student work should be developed, along with early interventions to support conceptual understanding.
Maegan Colbert, Virginia Tech
Isabel Valdivia, Virginia Tech
Jisun Kim, Virginia Tech
Tamika L. McElveen, Miami University (OH)
Sarah R. Powell, University of Texas at Austin
Amanda S. Mayes, Delaware Department of Education
Ma Bernadette Andres Salgarino, Santa Clara County Office of Education
Sara Schmitt, University of Oregon
David J. Purpura, Purdue University
Caroline B. Hornburg, Virginia Tech