Session Summary

Unpacking Test-Taker (Dis)Engagement: Innovative Approaches to Enhance Validity in Educational Assessment

Fri, April 12, 3:05 to 4:35pm, Convention Center, Floor: First, 122B

Session Type: Coordinated Paper Session

Abstract

The validity of inferences drawn from educational assessments is contingent not only on the quality of the test items but also on the level of engagement exhibited by the test takers. Disengagement can distort scores, potentially underestimating a student's abilities. This session delves into the multifaceted nature of test (dis)engagement, the methodologies used to detect it, and strategies to address its effects. Presentation 1 examines the impact of disengagement-adjusted scoring, recommending that score-adjustment procedures consider both rapid-guessing and performance-change information. Presentation 2 uses Monte Carlo simulations to quantify how disengagement affects the recovery of latent growth curve model parameter estimates and illustrates how to mitigate the resulting issues. Presentation 3 uses Monte Carlo simulations to examine parameter recovery for the multidimensional Nested Logit Model, which measures ability and test-taking effort jointly to provide a deeper understanding of test-takers’ response behaviors. Presentation 4 employs AI algorithms to analyze students' navigation patterns in the NAEP assessments, offering insights into test engagement. Lastly, Presentation 5 introduces and validates the Cognitive Modeling Method, a theory-driven approach to detecting not-fully-effortful responses, further improving the inferences drawn from assessments. Together, these presentations highlight the importance of (dis)engagement in assessments and provide novel approaches to ensure valid score interpretation.
