Paper Summary

Paper-Pencil and Computer Adaptive Test Formats: Comparing Student Psychometrics on a Mathematical Problem-Solving Assessment

Wed, April 8, 7:45am to Sun, April 12, 3:00pm PDT, Virtual Posters Exhibit Hall, Virtual Poster Hall

Abstract

Assessments are a critical pedagogical tool but often require significant testing time and effort. Our study examined whether a computer adaptive test (CAT) version of an existing paper-pencil mathematical problem-solving test could improve delivery efficiency and students' psychometric performance through a more targeted testing experience. Results showed improved test score reliability, higher separation, and smaller standard errors of measurement compared with the static test, along with reduced administration time. The CAT also functioned effectively using dichotomously scored constructed-response items rather than multiple-choice questions. These findings provide evidence that CAT can enhance score precision and efficiency for this mathematical problem-solving test, further supporting broader CAT applications in educational measurement contexts.
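The poster itself does not include code, but the mechanism behind the reported precision gains can be illustrated with a minimal, hypothetical sketch of a Rasch-based CAT for dichotomously scored items: each item is selected to maximize Fisher information at the current ability estimate, so the standard error of measurement shrinks faster than under a fixed-form test. All names and values below (item_bank, sem_target, the Newton-Raphson update, the ability clamp) are illustrative assumptions, not the authors' instrument or algorithm.

```python
import math

def rasch_prob(theta, b):
    """Probability of a correct (dichotomous) response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item at ability theta."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

def select_next_item(theta, item_bank, administered):
    """Choose the unadministered item with maximum information at the current estimate."""
    candidates = [i for i in range(len(item_bank)) if i not in administered]
    return max(candidates, key=lambda i: item_information(theta, item_bank[i]))

def update_theta(theta, responses, item_bank, administered, n_iter=10):
    """Newton-Raphson maximum-likelihood ability update (clamped to guard against
    divergence for all-correct or all-incorrect response patterns)."""
    for _ in range(n_iter):
        grad = sum(responses[i] - rasch_prob(theta, item_bank[i]) for i in administered)
        info = sum(item_information(theta, item_bank[i]) for i in administered)
        if info == 0:
            break
        theta = max(-4.0, min(4.0, theta + grad / info))
    return theta

def standard_error(theta, item_bank, administered):
    """SEM is the inverse square root of the accumulated test information."""
    info = sum(item_information(theta, item_bank[i]) for i in administered)
    return float("inf") if info == 0 else 1.0 / math.sqrt(info)

def run_cat(item_bank, answer_fn, max_items=20, sem_target=0.30):
    """Administer items until the SEM target or the item cap is reached."""
    theta, administered, responses = 0.0, [], {}
    while len(administered) < max_items:
        i = select_next_item(theta, item_bank, administered)
        responses[i] = answer_fn(i)  # 1 = correct, 0 = incorrect
        administered.append(i)
        theta = update_theta(theta, responses, item_bank, administered)
        if standard_error(theta, item_bank, administered) <= sem_target:
            break
    return theta, standard_error(theta, item_bank, administered)
```

Because each administered item is targeted near the examinee's provisional ability, information accumulates quickly and the stopping rule can be met with fewer items than a fixed form requires, which is consistent with the shorter administration time and smaller measurement error the abstract reports.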

Authors