Session Submission Summary

Educational technology and implementation science: Using A/B testing to iteratively improve and scale educational interventions

Wed, March 26, 11:15am to 12:30pm, Palmer House, 7th Floor, LaSalle 1

Group Submission Type: Formal Panel Session

Proposal

A/B testing and adaptive experiments are quick, nimble evaluation methods that have gained traction in the technology sector but remain underused in the social sector. They entail continuous rounds of testing, learning, adapting, and improving programs, generating fast, affordable, and policy-responsive evidence through ongoing experimentation. Unlike traditional randomized controlled trials (RCTs), which often assess a program's efficacy in a binary fashion, A/B testing focuses on fine-tuning and optimizing existing programs to determine what works best. This iterative approach allows interventions to be continuously refined based on real-time feedback, improving both their effectiveness and their adaptability. That is a particular advantage in resource-constrained education systems, such as those in the Global South, where there is an urgent need to translate idealized program designs into practical implementation across highly diverse classroom settings. This panel, featuring practitioners and researchers, will highlight the application of A/B testing in education, presenting findings from diverse contexts with a focus on implementation science.
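To make the methodology concrete, the short sketch below (not drawn from any of the panel papers) simulates an adaptive two-arm A/B test in Python: two hypothetical variants of an outreach message are compared, and a Thompson-sampling rule steers more learners toward the variant that appears to perform better as evidence accumulates. The variant labels, response rates, and sample sizes are illustrative assumptions only.

    # Illustrative sketch: adaptive two-arm A/B test with Thompson sampling
    # and Beta-Bernoulli posteriors. All numbers are hypothetical placeholders.
    import random

    TRUE_RATE = {"A": 0.10, "B": 0.14}  # hypothetical true response rates
    successes = {"A": 0, "B": 0}
    failures = {"A": 0, "B": 0}

    random.seed(0)
    for _ in range(2000):  # each iteration = one learner contacted
        # Draw a plausible response rate for each arm from its current posterior.
        draws = {arm: random.betavariate(successes[arm] + 1, failures[arm] + 1)
                 for arm in ("A", "B")}
        arm = max(draws, key=draws.get)               # assign the learner to the arm with the higher draw
        responded = random.random() < TRUE_RATE[arm]  # simulate the learner's response
        if responded:
            successes[arm] += 1
        else:
            failures[arm] += 1

    # Estimate the probability that B outperforms A by sampling both posteriors.
    wins_b = sum(
        random.betavariate(successes["B"] + 1, failures["B"] + 1)
        > random.betavariate(successes["A"] + 1, failures["A"] + 1)
        for _ in range(10_000)
    )
    print("assignments:", {arm: successes[arm] + failures[arm] for arm in ("A", "B")})
    print("P(B better than A) ~", wins_b / 10_000)

Because allocation adapts as data arrive, the more promising variant receives greater exposure sooner, which is part of what makes this style of testing fast and affordable relative to a fixed-allocation trial.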

Paper 1 offers an overview of A/B testing methodologies and their relevance in educational settings through the lens of Youth Impact's pioneering work in this field. It discusses the fundamental principles of A/B testing in the social sector, its advantages over traditional testing methods, and practical considerations for implementation. This foundational presentation sets the stage for understanding the subsequent case studies.

Paper 2 discusses how the nonprofit Rocket Learning has leveraged A/B testing to refine its in-person and digital interventions with care workers and parents in India, highlighting key insights from ongoing experiments aimed at enhancing engagement and learning outcomes among young children. The discussion will include challenges faced, strategies adopted, and the broader implications for policy and structural reform in the early childhood development sector.

Paper 3 introduces a structured, stage-based learning framework developed by IPA’s Right-Fit Evidence Unit as a foundation for effective A/B testing. This framework helps organizations identify and prioritize key learning questions. The paper then explores how to assess these questions alongside an organization's capabilities and resources to determine whether and when A/B testing can enable rapid, causal identification of effective product variations for educational technology interventions.

Rounding out the panel, our discussant from the Gates Foundation will bring a funder’s perspective, offering insights on how adaptive experiments can be leveraged to meet the learning goals of policymakers and practitioners. They will situate the presented research within broader advances in the literature, synthesize the findings, draw connections between the papers, and highlight the significance of adaptive and A/B testing in driving educational innovation.

These presentations collectively aim to advance the use of A/B testing for programs that are scaling and to showcase how rigorous adaptive testing methodologies can revolutionize educational research and implementation. The panel aligns with the theme of "Envisioning Education in a Digital Society" by illustrating how innovative, data-driven, digitally facilitated approaches like A/B testing can generate fast, affordable, policy-responsive evidence to improve education for all children.
