Session Submission Summary

Perspectives on Measuring Learning Remotely: Evidence from and Reflections on Four Different Remote Learning Assessments

Tue, February 21, 6:30 to 8:00pm EST, Grand Hyatt Washington, Floor: Constitution Level (3B), Constitution C

Group Submission Type: Formal Panel Session

Proposal

In early April 2020, school closures intended to halt the spread of COVID-19 affected an estimated 1.6 billion learners globally. For the first time in human history, an entire generation of children had their education disrupted. When schools closed, far too many national education systems were ill-prepared to provide good-quality remote learning opportunities in a timely manner, and in a way that could be effectively measured.

Innovative learning approaches did emerge during this time: organizations distributed paper materials and, in contexts where resources allowed, broadcast lessons through radio and television as well as through mobile phones and online. However, those strategies were a lifeline in an unprecedented global crisis; looking forward, we should strive to be better prepared for the next one.

How can school systems be better prepared to adapt to and combat disruptions, such as natural disasters and the COVID-19 pandemic? In particular, how can we remotely assess what distance learning methods actually work for children in these circumstances? And how can we ensure children can continue to learn and be assessed during times of crisis?

In our panel discussion, we will examine the role that remote learning assessments can play in creating opportunities for the most marginalized, in preparing teachers to educate students in uncertain, insecure times, and in supporting and contributing meaningfully to education for all. It is our contention that having valid and reliable tools to assess children remotely will greatly improve learning and teaching in both formal and informal settings.

In paper one, the authors share the results from four Remote Assessment of Learning (ReAL) alpha pilot sites and discuss preliminary results from the beta phase of testing in an additional five sites (Occupied Palestinian Territory, Sudan, the Philippines, Mozambique, and Bangladesh). They also discuss the goal of launching a fully validated measure as a global public good in 2023 and the ways it can be used at each level of the education system.

In paper two, the authors discuss best practices for using low-tech platforms to deliver phone-based assessments of learning. They describe implementation techniques and validity checks for phone assessments through the lens of six country studies where the approach has been implemented (Botswana, Kenya, Nepal, India, Uganda, and the Philippines). Finally, the authors speak to future possibilities and applications of the approach as a cost-effective, rapid-cycle alternative and/or supplement to in-person learning assessments.

In paper three, the authors share the results from a study that assessed the validity of administering the Early Childhood Development Index 2030 (ECDI2030) through phone-based interviews by comparing it to the standard face-to-face administration in Nepal. The authors present evidence for the validity and reliability of the phone-based ECDI2030 and discuss challenges in phone-based interview administration that still need to be addressed.

In paper four, the authors introduce Gobee, a gamified digital assessment tool designed for the Education in Emergencies (EiE) sector that children in grades 1-3 can play autonomously, and that has the potential to measure math and reading competencies as well as some social and emotional learning (SEL) skills. The Gobee prototype is currently in Phase 2 of development and testing; the authors share the results of Phase 1 and present the validity plan for Phase 2.
