Individual Submission Summary

Do Critical Thinking Skills Predict Student Upward Mobility?

Thursday, November 13, 8:30 to 10:00am, Property: Grand Hyatt Seattle, Floor: 1st Floor/Lobby Level, Room: Leonesa 3

Abstract

Higher-order thinking skills (HOTS), sometimes called critical thinking or problem-solving skills, are thought to be important for lifelong success. Despite their importance, a fundamental challenge has limited the advancement of knowledge in this area: these skills have been difficult to capture at scale. Prior attempts to capture higher-order thinking skills at scale come with significant limitations. No nationally administered or statewide assessment captures HOTS. Exams dedicated specifically to higher-order skills are not systematically administered and carry substantial administration costs. Some scholars have attempted to measure higher-order skills by identifying more challenging assessments, subscales, standards, or item formats (e.g., open-ended questions) within widely used exams. However, other scholars find that, across such approaches, only a small minority of items actually assess higher-order skills, lending little credibility to these strategies for capturing higher-order thinking. Furthermore, depending on the approach, roughly 33% to 90% of lower-order thinking items may be misattributed as HOTS, raising concerns that conclusions drawn about higher-order thinking fail to disentangle the two sets of skills. As a consequence, little is known about students' development of higher-order skills at scale, limiting our understanding of how educational systems promote these skills and of the long-run implications of doing so.


Our research surmounts these fundamental measurement challenges by validating an approach that directly measures higher-order thinking using pre-existing, item-level student performance data, allowing these skills to be studied at scale. In particular, we pair Massachusetts's item-level student performance data from 2001 to 2023 with a novel coding of all publicly available English Language Arts (ELA) standardized assessment items that flags the items capturing HOTS. Because our measure is built only from items that directly assess higher-order thinking, it avoids the misattribution problems faced by previous scholars. We then test whether supporting validity evidence exists for the measure's use. We show that the resulting HOTS measure, together with a parallel measure of lower-order thinking skills (LOTS), fits the internal structure of the pre-existing item-level data well, improves fit statistics relative to a model that assumes a single general ELA factor, and exhibits additional desirable characteristics (e.g., high standardized loadings). Moreover, the HOTS and LOTS measures enable comparisons across student demographics (race/ethnicity, socioeconomic status, gender, special education status, and English language learner status). Altogether, our results indicate that supporting validity evidence largely exists for retroactively incorporating information on higher-order thinking into item-level test data from extant administrative records, facilitating future policy research on important questions such as student upward mobility.