Session Submission Summary

New Technical Contributions to Measuring Foundational Skills for SDG 4.1.1a

Mon, March 24, 4:30 to 5:45pm, Palmer House, Floor: 3rd Floor, Salon 6

Group Submission Type: Formal Panel Session

Proposal

When Sustainable Development Goal (SDG) 4 was introduced in 2015, it shifted the focus from merely enrolling children in school to ensuring that they are also learning. The goal called on the global community to assess whether students are reaching at least a minimum proficiency level in reading and mathematics by the end of grades 2/3 (Indicator 4.1.1a), at the end of primary school (4.1.1b), and at the end of lower secondary school (4.1.1c). However, as of the halfway point to the 2030 deadline, only 37 countries are reporting results for grades 2/3, 101 for the end of primary school, and 85 for the end of lower secondary school (Author, 2023). To be useful for global comparison, data must be comparable across countries and over time, be nationally representative, and meet minimum quality standards. Ideally, these measurements should not only serve reporting purposes but also help build national capacity in learning assessment and be used to improve curricula, teacher education, and assessment practices through relevant new and emerging technologies.

On 23 October 2023, at a meeting of the UN Inter-agency and Expert Group on SDG Indicators (IAEG-SDGs), the indicator for SDG 4.1.1a was “demoted” from a Tier I to a Tier II indicator due to insufficient coverage, and there was a significant risk that low country coverage could lead to its removal from the list of global SDG indicators in 2025. The demotion raised concerns among foundational learning advocates, including the Global Coalition for Foundational Learning, that countries might misinterpret it as a signal that the issue itself lacks importance, when in fact it reflects inadequate reporting rather than a lack of relevance.

To address the demotion and increase the number of countries reporting, various proponents suggested using relatively new tools such as the Early Grade Reading Assessment (EGRA), the Foundational Learning Module (FLM) of the Multiple Indicator Cluster Surveys (MICS), and the International Common Assessment of Reading and Learning (ICARe) from the People’s Action for Learning (PAL) Network. However, these tools were designed primarily for advocacy, program design, monitoring, and evaluation rather than for global reporting and comparison. Importantly, they are not explicitly aligned with the UNESCO Institute for Statistics (UIS) minimum proficiency standards, and their psychometric properties are not well documented, which has prevented UIS from using data from these assessments for global reporting.

At a meeting of the Global Alliance to Monitor Learning (GAML), convened by UIS in December 2023 to discuss reasons for non-reporting, UIS was asked to develop eligibility criteria, both psychometric and procedural, to determine when these or any other assessments could be used for global reporting. Subsequently, UIS convened a Technical Advisory Group (TAG) to refine the criteria further and to propose ways to unpack the reporting of SDG 4.1.1a, particularly to address two key measurement issues relevant to low- and lower-middle-income countries:

• Language barriers in early grades: Home languages are often used for instruction in the early grades, but this practice typically ends by the end of primary school, when SDG 4.1.1b becomes relevant. This transition complicates measurement and benchmarking. For example, progress in reading a language with a transparent, phonetically regular orthography differs from progress in languages with complex sound-to-print correspondences or intricate scripts.

• Basic reading skills: Many children in grades 2 and 3 are still developing the basic, or precursor, elements of reading. These children are still learning to read rather than "reading to learn," which makes traditional assessment methods less effective for this age group.

To address these challenges, UIS called for establishing benchmarks for precursor reading skills, especially in countries where children have not yet achieved mastery of reading comprehension. Such benchmarks would enable these countries to track progress toward minimum proficiency levels in reading comprehension.

This panel will introduce the revised eligibility criteria (Author et al., 2023), the benchmarking method for precursor skills (Author et al., 2024), and a proposed EGRA tool that aligns with these criteria. In line with the theme of CIES 2025, the panel will highlight the importance of collecting high-quality student learning outcome data digitally with appropriate tools and using these data to inform effective policy decisions, interventions, and ongoing monitoring of student progress. The panel will present three papers.

1. Eligibility Criteria for Use of an Assessment to Report on SDG 4.1.1. This paper will outline the minimum criteria that national student assessments must meet to be eligible for reporting on SDG 4.1.1.

2. A Method of Setting Benchmarks on Precursor Skills through Reading Comprehension. This paper will present an Item Response Theory (IRT)-based approach to estimating benchmarks for precursor reading skills.

3. UIS Criteria and EGRA: Proposal for EGRA. This paper introduces a revised EGRA tool designed to measure foundational reading skills more accurately. The revisions address issues such as floor effects and improve the accuracy of comprehension assessment, ensuring that the tool meets the updated eligibility criteria for SDG 4.1.1a reporting.
