Cognitive Diagnostics for Calculus and Mechanics

Sun, April 12, 11:45am to 1:15pm PDT, InterContinental Los Angeles Downtown, 7th Floor, Hollywood Ballroom I

Abstract

Purpose
We demonstrate how cognitive-diagnostic computer adaptive testing (CD-CAT) supports flexible, timely, fine-grained formative assessment of learning in physics and calculus without sacrificing measurement accuracy or reliability. We situate this work within a broader initiative to build scalable, cross-disciplinary diagnostic assessments for STEM courses from 8th grade through college gateway courses using the LASSO platform, a free, research-aligned formative assessment system.
Framework
We conducted the work within the conceptual assessment framework of evidence-centered design for developing instructor-facing assessments. Here we focus on how the CD-CAT fits within the student and evidence models of that framework. CD models assess student understanding at the level of specific learning objectives (LOs), while CAT algorithms tailor question selection to each student’s performance. Together, CD-CATs offer efficient, individualized assessments that provide timely, fine-grained insights into student thinking.
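To make the division of labor between the two models concrete, the sketch below pairs a DINA-style student model with a simple adaptive rule that selects the unanswered item expected to shrink posterior uncertainty about a student's attribute profile the most. It is a minimal illustration only, not the authors' CD-CAT implementation; the uniform prior, the entropy-based selection rule, and helper names such as select_item and update_posterior are assumptions made for the example.

```python
import itertools
import numpy as np

# Minimal CD-CAT sketch (illustration only, not the authors' system):
# a DINA student model plus greedy item selection by expected posterior entropy.

def dina_prob(q_row, profile, guess, slip):
    """P(correct) under DINA: mastery of every required skill flips guess to 1 - slip."""
    has_all_required_skills = np.all(profile >= q_row)
    return (1.0 - slip) if has_all_required_skills else guess

def entropy(post):
    p = post[post > 0]
    return -np.sum(p * np.log(p))

def select_item(post, Q, guess, slip, remaining, profiles):
    """Pick the unused item whose expected posterior entropy is smallest."""
    best_j, best_h = None, np.inf
    for j in remaining:
        p_correct = np.array([dina_prob(Q[j], a, guess[j], slip[j]) for a in profiles])
        h = 0.0
        for p_x in (p_correct, 1.0 - p_correct):   # correct / incorrect response
            marginal = np.sum(post * p_x)
            if marginal > 0:
                h += marginal * entropy(post * p_x / marginal)
        if h < best_h:
            best_j, best_h = j, h
    return best_j

def update_posterior(post, q_row, guess_j, slip_j, response, profiles):
    """Bayes update of the distribution over attribute profiles after one response."""
    like = np.array([dina_prob(q_row, a, guess_j, slip_j) for a in profiles])
    like = like if response == 1 else 1.0 - like
    post = post * like
    return post / post.sum()

# Toy run: 3 skills, 4 items, uniform prior over the 2^3 attribute profiles.
K = 3
profiles = np.array(list(itertools.product([0, 1], repeat=K)))
Q = np.array([[1, 0, 0], [0, 1, 0], [1, 1, 0], [0, 1, 1]])
guess, slip = np.full(len(Q), 0.2), np.full(len(Q), 0.1)
post = np.full(len(profiles), 1.0 / len(profiles))
remaining = set(range(len(Q)))

j = select_item(post, Q, guess, slip, remaining, profiles)
post = update_posterior(post, Q[j], guess[j], slip[j], response=1, profiles=profiles)
remaining.discard(j)
```

In practice the slip and guess parameters would be estimated from data rather than fixed as in this toy run, and the item bank, stopping rule, and selection criterion would follow the platform's design rather than this simple entropy heuristic.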
Methods
The research employed an iterative, mixed-methods approach. First, we identified LOs based on the OpenStax textbooks and AP test standards. We then coded the items from existing research-based assessments (RBAs) for these skills to produce a Q-matrix linking each item to the skills it requires. "Deterministic Input, Noisy 'And' gate" (DINA) models then tested the fit of each Q-matrix to the data from each RBA. We reviewed the changes suggested by the DINA analysis and then tested each updated Q-matrix with DINA.
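As a concrete, hypothetical illustration of the coding step, the snippet below builds a toy Q-matrix and computes the ideal-response pattern the DINA model implies for it; the items, skills, and attribute profiles shown are invented for the example and are not the actual RBA codings described above.

```python
import numpy as np

# Illustrative only: a toy Q-matrix in the binary format used for DINA analysis.
# Rows are items, columns are skills/LOs; 1 means the item requires that skill.
Q = np.array([
    [1, 0, 0],   # item 1 requires skill 1 only
    [1, 1, 0],   # item 2 requires skills 1 and 2
    [0, 0, 1],   # item 3 requires skill 3 only
])

# Hypothetical attribute profiles for three students (1 = skill mastered).
alpha = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [1, 1, 1],
])

# DINA ideal responses: eta[i, j] = 1 iff student i has every skill item j requires.
# The "noisy AND" part of DINA then perturbs eta with each item's slip and guess parameters.
eta = np.all(alpha[:, None, :] >= Q[None, :, :], axis=2).astype(int)
print(eta)
```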
Data Sources
The data consisted of 15,921 college student responses across six post-course assessments collected with the LASSO platform: the Calculus Concept Inventory, the Calculus Concept Assessment, the Precalculus Concept Assessment, the Force Concept Inventory (FCI), the Force and Motion Conceptual Evaluation (FMCE), and the Energy and Momentum Conceptual Survey (EMCS).
Results
We identified 26 LOs for introductory calculus that we categorized into five groups. We coded the three mathematics RBAs for these five groups of LOs because the Q-matrix was too sparse for analysis with the full set of 26 LOs. The models fit the data well, with RMSEA and SRMSR values below the 0.05 cutoff for excellent fit.
We identified 37 LOs for introductory mechanics. We focused on 20 of these LOs, with three overlapping subsets of 8 LOs measured by each of the physics RBAs. The fit statistics for the FCI and EMCS ranged from excellent to acceptable for both RMSEA and SRMSR. Results were mixed for the FMCE, with an acceptable RMSEA2 but a high SRMSR.
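For reference, the SRMSR reported above is conventionally defined from the discrepancies between observed and model-implied inter-item correlations; the abstract does not spell out the formula, so the standard form is assumed below.

```latex
% Conventional SRMSR definition (assumed form; not given in the abstract).
% r_{jj'} is the observed and \hat{r}_{jj'} the model-implied correlation
% between items j and j'; J is the number of items on the assessment.
\mathrm{SRMSR} = \sqrt{\frac{2}{J(J-1)} \sum_{j<j'} \left( r_{jj'} - \hat{r}_{jj'} \right)^{2}}
```

Values below 0.05 correspond to the excellent-fit cutoff applied in the calculus results above.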
Significance
This work establishes a CD-CAT framework for assessing student understanding in both introductory calculus and mechanics, using data from research-based assessments to test our student models of LOs. These models may enable instructors and researchers to identify how specific mathematics and physics LOs contribute to learning challenges across the two disciplines, improving the precision of instructional support and educational research. The system offers flexibility in both the content and timing of assessment, allowing for adaptive, targeted evaluation aligned with diverse instructional goals or research needs. Together, these contributions lay the foundation for a scalable cognitive diagnostic system spanning secondary and postsecondary STEM education.

Authors