Objectives
This paper describes a novel application of the Rasch model. It illustrates how we integrated Rasch findings into a critical participatory action research design to guide the iterative development and refinement of a culturally sustaining measure of Black and Latina/o middle and high school students’ math engagement.
Perspectives
Applications of item-response theory models have traditionally focused on educational assessment (Bartolucci & Scrucca, 2010), but their use in supporting the development and validation of measures of “non-academic outcomes” (e.g., engagement) is beginning to emerge (Author et al., 2019). The Rasch model is particularly useful in practitioner-researcher partnerships when coupled with a “ruler” analogy, which helps practitioners view items as tick marks on a ruler, ordered along the dimension of interest (Author et al., in press). Less common in the literature, however, are examples of how the Rasch approach can support the development and validation of culturally sustaining measures of non-academic outcomes.
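To make the ruler analogy concrete, the sketch below computes endorsement probabilities under the dichotomous Rasch model, P(X = 1 | θ, b) = exp(θ − b) / (1 + exp(θ − b)). The item names and locations are invented for illustration; the point is that items sit as tick marks at locations b on the same logit scale as student measures θ, so a student is increasingly likely to endorse items located further below their own position.

```python
import math

def rasch_p(theta, b):
    """Probability of endorsing an item under the dichotomous Rasch model,
    where theta is the student's engagement measure and b is the item
    location (its "tick" on the ruler), both in logits."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Hypothetical item locations (logits), ordered along the engagement dimension.
items = {"easy-to-endorse item": -1.0, "mid-range item": 0.0, "hard-to-endorse item": 1.5}

# A student at theta = 0.5 is more likely to endorse items located below them.
for name, b in sorted(items.items(), key=lambda kv: kv[1]):
    print(f"{name} (b = {b:+.1f}): P(endorse) = {rasch_p(0.5, b):.2f}")
```

Because student measures and item locations share one scale, "targeting" can be read directly off the ruler: items cluster where they measure students most precisely.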
Data Sources and Methods
Annual cycles of qualitative and quantitative data collection informed the iterative refinement of the measure. In 2023, the Math and Science Engagement Survey (Fredricks et al., 2016) was administered (n = 2,056 middle and high school students; 36% response rate); it contained 33 items measuring the four “traditional” dimensions of engagement: cognitive, behavioral, emotional, and social. In 2024, the initial version of the adapted measure of math engagement (AM-ME) was administered (n = 2,781 middle and high school students; 46% response rate); it contained 72 items spanning eight constructs identified from existing research, qualitative data, and exploratory factor analysis. For both surveys, Rasch analyses examined each subscale for item fit, item targeting, and differential item functioning (DIF) as a function of student race/ethnicity.
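As one illustration of the item-fit checks named above, infit and outfit mean-squares summarize residuals between observed responses and Rasch model expectations; values near 1.0 indicate good fit, and values above 2.00 are commonly treated as concerning misfit (Linacre, 2017). The function name and example simplification to dichotomous items are ours, not the study's:

```python
import math

def fit_mean_squares(responses, thetas, b):
    """Infit and outfit mean-squares for one dichotomous item, given
    observed 0/1 responses, student measures (thetas, in logits), and
    the item location b."""
    resid_sq, variances, z_sq = [], [], []
    for x, theta in zip(responses, thetas):
        p = 1.0 / (1.0 + math.exp(-(theta - b)))  # Rasch expected score
        var = p * (1.0 - p)                        # model variance
        resid_sq.append((x - p) ** 2)
        variances.append(var)
        z_sq.append((x - p) ** 2 / var)            # squared standardized residual
    outfit = sum(z_sq) / len(z_sq)                 # unweighted mean-square
    infit = sum(resid_sq) / sum(variances)         # information-weighted mean-square
    return infit, outfit
```

Outfit is sensitive to unexpected responses from students far from the item's location; infit down-weights those outliers, emphasizing students near the item.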
Results
For both surveys, no items showed concerning misfit (all fit statistics < 2.00; Linacre, 2017), and most items targeted students at lower levels of engagement. For example, relative to the range of student placement along the “Positive Feelings towards Math” AM-ME dimension (-5.29 to 7.07), the range of item placement was narrow (-0.49 to 0.53), and the substantial proportion of students located above the highest item indicated a ceiling effect. Both surveys also had items exhibiting DIF (DIF contrast ≥ 0.42 and Mantel-Haenszel chi-squared p < .05; Zwick et al., 1999). For instance, within the “Positive Feelings towards Math” subscale, Black students were more likely than Hispanic or “Other” students to agree with “I can be myself” (DIF contrast = -0.51, p < .05 and -0.64, p < .001, respectively).
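A minimal sketch of the Mantel-Haenszel chi-square used to flag uniform DIF follows; all counts below are invented for illustration, and the sketch assumes dichotomized endorsements with students matched (stratified) on their overall engagement level:

```python
def mantel_haenszel_chi2(strata):
    """Mantel-Haenszel chi-square (with continuity correction) over K
    matched-score strata. Each stratum is a 2x2 table (a, b, c, d):
        a = focal-group endorsements,     b = focal-group non-endorsements,
        c = reference-group endorsements, d = reference-group non-endorsements.
    Large values indicate that endorsement odds differ between groups
    even after matching on the trait, i.e., uniform DIF."""
    sum_a = sum_e = sum_v = 0.0
    for a, b, c, d in strata:
        t = a + b + c + d
        row1, col1 = a + b, a + c
        e = row1 * col1 / t                                 # expected a under no DIF
        v = row1 * (c + d) * col1 * (b + d) / (t * t * (t - 1))
        sum_a += a
        sum_e += e
        sum_v += v
    return (abs(sum_a - sum_e) - 0.5) ** 2 / sum_v

# Hypothetical counts for one item across three matched engagement levels.
strata = [(30, 10, 20, 20), (25, 15, 15, 25), (20, 20, 10, 30)]
chi2 = mantel_haenszel_chi2(strata)  # compare to the chi-square(1) critical value, 3.84
```

The DIF contrast reported above complements this test: it is the difference (in logits) between the item's locations estimated separately for the two groups, so the chi-square flags statistical significance while the contrast conveys the size of the difference.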
Significance
Findings from the Rasch analyses informed conversations with the AM-ME Research Group about improving item targeting and addressing DIF. We conducted subsequent focus groups to define higher levels of each dimension and to identify possible sources of DIF. By describing this iterative item-improvement process, this paper illustrates how the Rasch model can be leveraged to refine the conceptualization of non-academic constructs from a culturally sustaining perspective while improving measurement precision across the full range of each construct.