Session Submission Summary

Rethinking early grade reading benchmarks: Innovations in setting oral reading fluency standards

Thu, April 18, 3:15 to 4:45pm, Hyatt Regency, Floor: Atrium (Level 2), Waterfront B

Group Submission Type: Formal Panel Session

Proposal

Over the past decade, oral reading fluency (ORF) has become the most prevalent proxy measure for children’s overall reading abilities and for the quality of education systems. Low- and middle-resource countries have benefited from technical support to establish minimal ORF benchmarks, based on Early Grade Reading Assessment (EGRA) data, and to track the percentage of pupils meeting or exceeding these benchmarks. This latter calculation has provided a means of comparing the relative success of donor-driven early grade reading (EGR) initiatives across the globe. While the UNESCO Institute for Statistics (with support from a wide variety of partners) is developing Reporting Scales for SDG indicator 4.1.1 to enable common definitions of “minimum proficiency” and more comparable benchmarks across countries and assessments, improved EGRA benchmarking remains a priority for many donors, implementers, and national policymakers.

This panel will focus on applying lessons learned from years of EGRA benchmarking activities, while presenting examples of innovative approaches to setting ORF benchmarks across a number of countries. The presentations will examine both the technical issues surrounding benchmark setting and the implications of stakeholder engagement for the standard-setting process.

The chair will introduce the session with a brief background on how oral reading benchmarks have traditionally been set using EGRA data, with a focus on common pitfalls and limitations. This will be followed by four presentations.

The first presentation will focus on lessons learned from a recent benchmarking report that analysed the process across 40 languages, followed by how best practices from the report were used to set benchmarks for 5 languages in India. This presentation will cover the process of defining standards and setting benchmarks empirically (using a variety of methods), while also reviewing effective methods for engaging policymakers and other stakeholders in the benchmarking process.

The second presentation will provide an overview of a modified Angoff approach for setting oral reading fluency and comprehension benchmarks. It will describe how a performance-based standard-setting methodology used to establish ORF and comprehension benchmarks in Lebanon can better align EGRA benchmarking activities with other well-established standard-setting protocols. This approach is also similar to the policy-linking work that is ongoing with regard to SDG reporting.

The third presentation will build on the first two by comparing minimal oral reading fluency benchmarks established using local expert knowledge (i.e., presentation 2) with those established through data-driven processes (i.e., presentation 1). It focuses on two benchmarking activities in Rwanda and examines the impact of the different approaches on the percentage of pupils meeting grade-level expectations. The advantages and disadvantages of each method will also be discussed, with a particular focus on the level of Ministry involvement and ownership of the standard-setting process.

The fourth and final presentation moves beyond the technical aspects of benchmarking to examine how the Early Grade Reading Barometer can be used to support the setting of benchmarks and targets for reading outcomes. As a dynamic tool offering interactive use of EGRA data, the Barometer is a valuable resource for understanding benchmarks in relation to program performance and provides a user-friendly, hands-on approach for helping governments and stakeholders set appropriate, data-driven targets.

Finally, our discussant will guide a discussion about how to make sense of the issues raised by the various perspectives on the benchmarking process, as well as potential opportunities and lessons learned.

Chair

Individual Presentations

Discussant