Background
Self-testing significantly improves student learning (e.g., the meta-analysis by Yang et al., 2021). However, few studies have examined its effectiveness in real-world higher-education settings, such as gateway mathematics courses, or its impact on related but non-tested content. Such evidence is needed to inform evidence-based (digital) learning environments and recommendations for a more equitable education system in which all students can thrive through the use of effective learning strategies. Despite concerns that self-testing may not support transfer as content complexity increases (Anderson et al., 1994; van Gog & Sweller, 2015), there is evidence that transfer to non-tested, conceptually related problems is possible (Bjork et al., 2014; Butler, 2010). Using within-exercise randomization, the present study examined (a) whether self-testing affects student performance in a gateway mathematics course and (b) whether it enables transfer to non-practiced math problems.
Method
Two cohorts of business administration students enrolled in a gateway mathematics course participated in the study (Cohort 22/23: N1 = 940, 48% female; Cohort 23/24: N2 = 861, 51% female). Students completed math problems weekly and took three practice tests over the semester (Figure 1). Within-exercise randomization ensured that all students were assigned to, and could benefit from, the weekly online self-testing exercises, while the specific problems each student received differed in part. Fixed effects for each math problem served as control variables. Participation was incentivized: students could earn credit toward their final exam. The practice test problems differed between the two cohorts.
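The within-exercise randomization described above can be sketched as follows. This is a minimal illustration with hypothetical pool sizes and function names (the study's actual implementation is not specified): each student draws a random subset of a weekly problem pool, so assignment varies at the student-by-problem level while problem identity can enter as a fixed effect.

```python
import random

# Hypothetical parameters: 8 problems per weekly pool, 4 assigned per student.
POOL = [f"problem_{i}" for i in range(8)]
PER_STUDENT = 4

def assign_exercise(student_id: int, week: int, seed: int = 0) -> list[str]:
    """Randomly assign a subset of the weekly pool to one student.

    Seeding on (seed, student, week) makes assignments reproducible while
    still varying across students and weeks.
    """
    rng = random.Random(seed * 100_003 + student_id * 101 + week)
    return rng.sample(POOL, PER_STUDENT)

# Assignments for five hypothetical students in week 1.
assignments = {s: assign_exercise(s, week=1) for s in range(5)}
```

Because every student receives the same number of problems but not the same problems, "assigned vs. not assigned" is random for each problem, which is what allows the causal comparisons reported in the Results.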
Results
OLS regressions showed similar intent-to-treat (ITT) and participation effects in both cohorts (Table 1, columns 1–2 and 5–6). Panel fixed-effects and instrumental-variable regressions controlling for self-selection estimated treatment-on-the-treated (TOT) effects close to the ITT effects (Table 1, columns 3–4 and 7–8): students were about 2–3% more likely to solve a practice test problem they had practiced in similar form in their weekly online exercises (e.g., the same problem with different numbers). Students performed best on similar math problems, significantly worse on mid-transfer problems (e.g., different wording), and worst on far-transfer problems (new types of math problems; Table 1). Results including interactions between the treatment variables (which specific problems were practiced) and the transfer condition reveal differences between the cohorts, whose practice test problems differed. For cohort 22/23, the TOT effect is reduced for mid- and far-transfer problems (Table 2, columns 3–4). For cohort 23/24, the TOT effect is present only for the mid-transfer problems (Table 2, columns 7–8).
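The ITT/TOT distinction can be illustrated with a small simulation on hypothetical data (the numbers below are invented, not the study's). With a single binary instrument (random assignment) and no practice among the unassigned, the IV estimate of the TOT effect reduces to the Wald ratio: the ITT effect divided by the first-stage compliance rate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # hypothetical student-by-problem observations

# z: random assignment of a problem to a student's weekly exercises;
# d: actual practice (only assigned students can practice, ~70% comply);
# y: whether the matching practice-test problem is solved (true effect ~3 pp).
z = rng.integers(0, 2, n)
d = z * (rng.random(n) < 0.7)
base = 0.5 + 0.1 * rng.random(n)            # heterogeneous baseline ability
y = (rng.random(n) < base + 0.03 * d).astype(float)

# ITT: difference in mean outcomes by assignment (OLS on a binary regressor).
itt = y[z == 1].mean() - y[z == 0].mean()

# TOT: Wald/IV estimator, scaling ITT by the first-stage effect of z on d.
first_stage = d[z == 1].mean() - d[z == 0].mean()
tot = itt / first_stage

print(f"ITT = {itt:.3f}, TOT = {tot:.3f}")
```

Because not everyone assigned actually practices, the TOT estimate is larger than the ITT estimate, consistent with the abstract's observation that the two were close when compliance was high.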
This study is among the first to provide causal evidence on the effectiveness of self-testing in a gateway math course. Notably, the analyses suggest that the treatment effect can extend to mid- and far-transfer math problems, but that constructing the practice test problems requires great care. These findings demonstrate the effectiveness of practice testing in (higher-order) math and underscore its considerable promise for supporting student learning in challenging educational contexts and for advancing a more equitable education system.