Paper Summary

Designing and Implementing Instructionally Supportive Assessment Tasks for Promoting Three-Dimensional Learning: Challenges Faced and Lessons Learned

Sun, April 7, 11:50am to 1:20pm, Metro Toronto Convention Centre, Floor: 700 Level, Room 709

Abstract

We describe our multi-institutional effort to design and implement instructionally supportive assessment tasks that integrate disciplinary core ideas, science and engineering practices, and crosscutting concepts as called for in the Next Generation Science Standards (NGSS Lead States, 2013). We present our design approach along with example tasks and rubrics, discuss results from implementation, and share challenges encountered and lessons learned from developing the tasks and using them in middle school classrooms.

The vision underlying the NGSS describes science proficiency as using and applying knowledge to make sense of phenomena or to design solutions to problems (NRC, 2012). A central tenet is that core ideas, practices, and crosscutting concepts are tightly intertwined and work together to advance learning. Accordingly, the NGSS integrate these three dimensions in the form of performance expectations that describe what is to be assessed at the end of a grade level or grade band.

One substantive challenge for assessment design is how to develop tasks that can be used during instruction to help teachers assess students’ progress toward meeting the performance expectations (NRC, 2014). Our solution leverages a principled assessment design process, evidence-centered design (ECD; Mislevy & Haertel, 2006), for analyzing the performance expectations and specifying their essential and assessable components. ECD emphasizes the articulation of coherent, logical relationships among (1) learning goals that constitute the constructs to be measured; (2) evidence in the form of performances that should reveal these target constructs; and (3) features of tasks that should elicit the desired evidence.

We use ECD to systematically unpack performance expectations and synthesize the unpacking into multiple components called learning performances. The learning performances represent key aspects of the proficiency that students need to demonstrate in meeting the NGSS performance expectations. We use learning performances to guide the development of assessment tasks, evidence statements, and scoring rubrics. Our design process involves three phases: (1) domain analysis, which involves unpacking the dimensions in the performance expectations to identify their assessable components; (2) domain modeling, which involves constructing learning performances and creating design specifications for tasks; and (3) task construction, which involves using the design specifications to create tasks and rubrics.

We report results from a task performance study involving over 800 middle school students, collectively representing 12 schools and three districts serving diverse student populations. While many students did not achieve the highest three-dimensional scores, we found that our rubrics enabled us to identify developing three-dimensional proficiencies.

We also report results from research examining how teachers used the assessment tasks with students. When teachers used the tasks formatively, students’ performance data provided actionable information for instruction, notably which aspects of performance students needed more practice with. As such, the tasks can inform teachers about the extent to which their students’ proficiency is indeed three-dimensional.

A critical need exists for R&D on high-quality assessments that align with the proficiency goals of the NGSS and are instructionally suitable for classrooms. We anticipate that our design solutions and implementation insights will be valuable to assessment designers, science education researchers, and practitioners.
