Paper Summary
Developing and Piloting the Insight Learning System

Sat, April 9, 8:15 to 9:45am, Marriott Marquis, Floor: Level Two, Marquis Salon 17

Abstract

Objectives
1. Describe the components of the Insight Learning System (ILS)
2. Describe the process used to design the components
3. Explore pilot results that suggest linkages between system components

Theoretical Framework
The purpose of the ILS project is to iteratively design, develop, and test a set of integrated components: a digital game, online performance tasks, instructional modules, and professional development for teachers. Together, these components constitute a system coordinated around a mathematics learning progression (LP) targeting third-grade students’ understanding of ideas related to geometric measurement of area. During 2014, researchers identified an initial set of hypotheses about instruction, assessment, and student learning based on learning sciences research. We incorporated these hypotheses into the design of prototypes of the system components. Thus, the theoretical frameworks of primary interest are those from mathematics education, particularly empirical research on how students learn and understand geometric measurement of area (Barrett et al., 2011; Battista et al., 1998; Common Core Standards Writing Team, 2012; Sarama & Clements, 2009).

Methods
Using a design-based research approach (Collins, Joseph, & Bielaczyc, 2004), the research team has now carried out three complete iterations of a hypothesize-test-revise cycle. During each iteration, we put prototype materials in front of students and conducted cognitive labs. We then used results to revise the prototypes for the next round of testing. During iteration 3, we piloted the entire system in six third-grade classrooms in a Midwestern elementary school. We will complete our fourth round of piloting in both U.S. and Australian schools in the fall of 2015.

Data Sources
During iterations 1-2, we conducted cognitive labs with 27 students; we recorded, transcribed, and coded each interview. In addition to this response process evidence, we collected student responses to the learning and assessment activities in six classrooms in spring 2015, as well as teacher observations of student performance, data on teachers’ completion of professional development modules, and teacher responses to professional development prompts. We will collect the same data in a number of U.S. and Australian classrooms in the fall of 2015.

Results
We have completed analysis of cognitive labs from iterations 1-2. Results provide initial support for the stages included in the LP and for some of the hypothesized sequences among those stages. We are still completing analyses of the spring 2015 pilot data, and data collection for fall 2015 has not yet commenced. We will analyze relationships between the various assessment components, between the assessment and learning components, and between teachers’ professional development participation and measures of teacher knowledge and student performance.

Significance
LPs hold promise for guiding design of learning system components. However, there is much work to be done to realize their full potential. As Corcoran, Mosher, and Rogat (2009) observe, “learning progressions are as yet unproven tools for teaching and learning” (p. 5). This study will provide evidence for evaluating how well systems built around LPs can support teaching and learning.
