The Initial Intent of Common Core State Standards for English Language Arts and Aligned Instruction and Tests: A Task Analysis

Mon, April 11, 11:45am to 1:15pm, Convention Center, Floor: Level Two, Room 207 A

Abstract

Purpose
Many ELA scholars appreciated the shifts represented by the U.S. Common Core State Standards in English Language Arts (CCSS-ELA). The increased focus on close reading, evidence-based argument, and disciplinary literacies reflected longstanding concerns about the scarcity of detail-oriented, rich reading and extended writing in American secondary schools, and scholars saw potential in the CCSS-ELA to address that scarcity (Applebee & Langer, 2011; Authors, 2013).

Yet the manner in which U.S. Common Core State Standards-driven instruction was initially conceptualized (e.g., Coleman & Pimentel, 2012) has been critiqued as lacking an empirical basis (Pearson, 2013). New PARCC, Smarter Balanced, and College Board assessments of Common Core competence are drawing further critique (Cain & Oakhill, 2006; FairTest, 2013). The task analysis reported in this paper explored how the tasks implied by the CCSS-ELA align with CCSS instructional tasks and assessments. It asked: What literacy instructional practices seemed intended in the initial standards documentation? How do these practices align with actual CCSS instructional tasks and assessments? What are the gaps between the CCSS and their instantiations in CCSS instructional tasks and assessments?

Data Sources and Methods
Data sources included four samples of published high school ELA instructional materials, along with testing frameworks and sample items from the PARCC, Smarter Balanced, and College Board assessments. The analyses examined how key constructs in the CCSS-ELA were described and operationalized across sources, including text complexity, academic language, close reading, and argumentation. More specifically, the inquiry used two instructional design task analysis models, top-down and matrix, to consider how the standards did or did not align with instructional materials and assessments (Martin, 2011): the top-down analysis began with the goals represented in the standards document, and the matrix analysis began with the task demands of curricula and assessments.

Preliminary Findings
Preliminary findings suggest that the CCSS-ELA implied a wide range of generative literacy instructional experiences across genres in order to provide young people with “college and career readiness” (NGA, 2010). However, aligned instructional tasks reflected a considerable narrowing of what the CCSS seemed to intend (e.g., only complex texts rather than a range of text complexity and types, test-like argumentative writing rather than a wide range of academic writing). Unsurprisingly, testing frameworks and assessment samples offer even narrower tasks, including complex permutations that may yield discriminating test items but could also yield classroom test-preparation activities that are, given the goals of the Standards, essentially a waste of students’ time. The full paper will include tables comparing literacy tasks across these sources to support these assertions.

Conclusions
Many educators were initially optimistic that CCSS-ELA implementation could yield Singapore-like gains in our students’ literacy performance. This analysis suggests instead the narrowing that standards dreamers feared, one that encourages test preparation rather than generative reading and writing (Shanahan, 2014). We hope this paper renews appreciation for the kind of multilayered literacy instruction needed to fully address the aspirations represented in these standards.
