In Event: Hilton Roundtable Session 13
In Roundtable Session: Practical Measures in Service of Instructional Improvement: Attending to Issues of Rigor
The purpose of this paper is to describe our efforts to design and implement a practical measure that helps identify key predictors of students' experience of relevance and coherence in curriculum.
Research has shown that materials with strong intra-unit and cross-unit coherence can build student understanding over time (Fortus & Krajcik, 2012). However, even well-designed materials do not always cohere from the perspective of students. Important connections between ideas can be enhanced or lost, depending on how teachers introduce and adapt materials (Fogleman, McNeill, & Krajcik, 2011). This variability in students' experience of coherence shapes their opportunities to learn and can profoundly influence their learning outcomes. Understanding variation in how students experience coherence is thus an important target for continuous improvement.
In practical measurement, a key aim is to identify strong predictors of valued outcomes, particularly more distal ones (Yeager et al., 2013). In this study, we sought to identify predictors of a more proximal outcome: students' day-to-day experience of coherence within project-based science units. We were particularly interested in identifying clusters of classroom experience associated with students' experience of coherence, defined here as a sense of how that day's activities contribute to an answer to the driving question of the unit. Driving questions are a central feature in project-based learning for motivating student learning (Blumenfeld, Soloway, Marx, Guzdial, & Palincsar, 1991).
Over a period of three years, the co-design team iterated over the design of practical measures and units simultaneously, analyzing data from practical measures and working to refine hypotheses about what clusters of student classroom experience might help explain variation in coherence. Students completed 4-minute surveys on notebook computers on a weekly basis, reporting on the lesson being taught, their experience of coherence, and a range of other experiences: emotions felt, perceptions of relevance to themselves and their class, participation in science practices, and contributions to group activity.
Three predictors were related to coherence, though only at a moderate level. First, when students reported that others' explanations helped them learn, they were more likely to say that the day's activity helped them answer the driving question of the unit (r = .32). Similarly, when students reported that the lesson was personally relevant to them (r = .25) and when they were excited by the lesson (r = .22), they were more likely to say that the lesson contributed to an answer to the driving question.
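The correlations above are standard Pearson product-moment coefficients between survey items. As an illustrative sketch only (the project's actual analysis pipeline and item scales are not described here), the computation on hypothetical Likert-scale responses looks like this:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Covariance numerator and the two standard-deviation terms
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical example: per-student ratings of "others' explanations helped
# me learn" paired with ratings of coherence (both on a 1-5 scale).
explanations = [4, 2, 5, 3, 4, 1, 5, 3]
coherence = [5, 3, 4, 3, 4, 2, 5, 2]
r = pearson_r(explanations, coherence)
```

The variable names and sample data here are invented for illustration; in practice one would compute these correlations across all weekly survey responses, likely with a statistics package that also reports significance.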
Though these correlations are modest, presenting the data to teachers in the project has helped reinforce the importance of connecting lessons both to students' personal experience and to the overall driving questions as a means to maintain high levels of student engagement. In addition, sharing these findings within the project has led to more focused design attention on the instructional components that support teachers in making these connections. As such, they present one way that data from practical measures that focus on proximal outcomes can guide improvement efforts within design-based research-practice partnerships.