Group Submission Type: Formal Panel Session
In early April 2020, school closures intended to halt the spread of COVID-19 affected an estimated 1.6 billion learners worldwide. For the first time in human history, an entire generation of children had their education disrupted. It quickly became clear that, when schools closed, far too many national education systems were ill-prepared to provide good-quality remote learning opportunities, accompanied by psychometrically sound assessments, in a timely manner.
During this challenging time, innovative learning approaches emerged: wherever resources permitted, organizations distributed paper materials and delivered lessons through media such as radio, television, mobile phones, and online platforms. While these strategies served as a lifeline during the global crisis, it is crucial that we learn from this experience and strive to be better prepared for future challenges. Although the acute phase of the COVID-19 pandemic has passed, the challenges faced by education systems in emergency and hard-to-reach settings persist.
How can school systems be better prepared to adapt to and withstand disruptions such as natural disasters and the COVID-19 pandemic? In particular, how can we remotely assess whether distance learning methods actually work for children in these circumstances? How can we ensure that children continue to learn and be assessed during times of crisis? And, importantly, how can data on children's foundational skill acquisition inform policies and programs that better serve these communities? To answer these questions, this panel reports emerging findings from the psychometric evaluation of the Remote Assessment of Learning (ReAL) tool.
In our panel discussion, the ReAL global team, comprising members from Save the Children US, Save the Children International, Yale University, and the Free University of Berlin (FUB), will be joined by country offices and academic partners from three of the seven countries where the ReAL beta pilot is being tested. The panel will provide a holistic overview of data collection and of validity and reliability efforts across contexts, with the goal of developing and releasing a fully validated remote assessment of learning tool that can serve as a global public good.
In addition to the psychometric evaluation of ReAL, the panel will speak to the significance of establishing interinstitutional partnerships that prioritize representation and equity in research. The ReAL project actively fosters collaboration among a wide range of organizations, including local academic institutions, government agencies, and non-governmental organizations. These partnerships make the work more inclusive, more comprehensive, and more reflective of diverse needs and experiences, thereby, we believe, strengthening the validation process of ReAL.
In paper one, we provide global psychometric evidence for the reliability and validity of the ReAL tool using data from Cambodia, El Salvador, Mozambique, Niger, the Occupied Palestinian Territories, the Philippines, and Sudan. We will report on the extent to which the domains of learning are structurally consistent across countries; on measurement invariance by gender, child age, in-school versus out-of-school status, and country; and on the criterion validity of the tool, tested against assessments such as EGRA, EGMA, and measures of social-emotional development. Findings will inform recommendations for the use of ReAL as a fair and unbiased assessment of foundational skills across a variety of contexts.
In paper two, we provide an overview of the high-access modality of the beta pilot in Palestine. The exercise targeted about 1,200 child-caregiver dyads, selected to reflect the demographic, geographic, and educational-system diversity of the overall population. A multi-layered validation design was embedded within the exercise, comprising inter-rater validation, retesting, and administration of a criterion measure for a sizable sub-sample. A coding system assigning a unique identifier to each record throughout the administration of the tool supported these quality checks and allowed for proper analysis. The results of the pilot study provided valuable insights into the assessment approach and the foundational skills of children in Palestine and similar contexts. The data collected through ReAL offer an objective assessment of students' foundational skills, highlighting areas of strength and areas that require further attention. These findings shed light on existing challenges and gaps in the education system, serving as a basis for targeted interventions and policy decisions to improve the quality of education.
Paper three provides an overview of the beta pilot study in four rural schools in Metuge District, Cabo Delgado Province, Mozambique. The high-access modality version of the assessment tool was adapted to align with Mozambique's educational curriculum and language, in a collaboration between teachers and researchers at the Faculty of Education, Eduardo Mondlane University, supported by Save the Children in Mozambique. The paper covers the sample selection, assessment modality, adaptation of the assessment tool, research objectives, key findings, and challenges.
Paper four provides evidence for the reliability and validity of the ReAL high-access tool in Cambodia. This validation study assessed inter-rater and test-retest reliability, structural validity using confirmatory factor analysis, criterion validity, and measurement invariance across key socio-demographic characteristics. We used data from a total sample of about 1,000 children aged 5 to 12 years, together with their caregivers, in three districts (Kampong Trolach, Rolea B'ier, and Samaki Meanchey) of Kampong Chhnang Province, Cambodia. The results will inform the relevant government agencies, in particular the Ministry of Education, Youth and Sport (MoEYS), and development stakeholders as they adopt and adapt the remote assessment tool. The tool is attractive for its cost efficiency, its potential to support a monitoring system for the MoEYS's Education Quality Assurance Department (EQAD), and its suitability for reaching hard-to-reach children, particularly in the floating villages and schools of Tonle Sap Lake, Cambodia.
Cross-country psychometrics and framing of ReAL - Sascha Hein, Free University Berlin; Elizabeth Hentschel, Yale Child Study Center; Clay Westrope, Save the Children US; Julia Taladay, Save the Children; Liliana Angelica Ponguta, Yale University
Validation Study of the Remote Assessment of Learning in Cambodia - Rada Khoy, Save the Children International; Borin Srey, Save the Children International; Sakem Kong, Save the Children Cambodia; Puthichan Pha, Save the Children Cambodia; Sorachana Seng, Save the Children Cambodia; Sithon Khun, Save the Children; Sarin Sar, Ministry of Education, Youth and Sport
Validation Study of the Remote Assessment of Learning in Palestine - Ali Nashat Shaar, Palestinian Child Institute; Nermin Kamal Habaybeh, An Najah National University
Validation Study of the Remote Assessment of Learning in Mozambique - Adriano Simao Uaciquete, Universidade Eduardo Mondlane & Ghent University; Belmiro Novele; Dyson Joe Likomwa, Save the Children International; Nelson Zavale, Save the Children International