Group Submission Type: Formal Panel Session
There is growing recognition of a global learning crisis: many children are in school but learning very little. Despite this crisis, it is not the case that we have no tools to improve learning. Indeed, there is now substantial evidence on effective approaches to improving learning, and increasing alignment on what (might) work to address the learning crisis (see, for example, Akyeampong et al. 2023). The new challenge before us is how to scale and implement evidence-based programs effectively. Moreover, as agreement on what to try increases, it is essential to establish systems for checking whether the solutions being implemented work across contexts and whether adjustments, tweaks, and further innovations are needed (Peters et al. 2013). Systems pursuing foundational literacy and numeracy (FLN) interventions or reforms need more than the red flags that routine M&E can wave. They need proactive feedback loops to help implementing staff identify and scale elements that are working well and test adjustment options for those not functioning optimally.
Recent investment projects such as the What Works Hub (WWH), the Implementation Research Fund and the Jacobs Foundation's work to bridge research and practice aim to meet these heightened system needs. The Oxford Blavatnik-led WWH investment has a pillar (Pillar 3) that focuses on implementation research. Similarly, the BMGF Implementation Research Fund exists solely to fund, test and promote the idea that more and better solutions for more children's learning can be found and fostered if we set out to test and adjust according to evidence as we scale what works. Finally, the Jacobs Foundation has experimented with groups of investments such as Leveraging Evidence for Action to Promote Change and the School Action Learning Exchange, as well as country-specific research ecosystem investments (Education Evidence Labs), learning lessons from each. The aim of this panel is to generate collective action - among the panelists and with interested audience members - to build upon these examples and investments, challenge the status quo of systems scaling without ongoing evidence generation and application, and shape future investments toward smarter, adaptable solutions for learning at scale.
These investments rest on principles of stakeholder engagement, trusting collaborations and curious implementers, while themselves testing what works and adapting as they go. They take up the generation and use of evidence from different starting points and at different levels of the FLN ecosystems of various countries. Looking at each example and then across them, this panel shares the different needs addressed by different types of support, the approach each takes to supporting implementation research, and what we can learn from their efforts. Across the papers we can gather lessons for investing in and undertaking implementation research. The panel will share implications for how FLN investments are designed and funded. It should raise questions about the status quo of designs used in multi-million-dollar, multi-year FLN efforts funded by bilateral and multilateral players in international education. It should build towards a community of researchers and implementers as successful protesters - one whose voice begins to coalesce around learnings that enable constructive expression of displeasure with the system, and that builds together towards a clear set of demands for improving these systems (de Haan et al. 2020).
Together, these investments in evidence generation and use, and the assumptions and pathways that they test, have the potential to inform more effective systems. Realizing this potential requires these teams to be in conversation with one another: each consciously testing hypotheses about what works using its own funding, while learning about what others are testing and finding. This session is an invitation to join the conversation about what we know so far about investing in implementation research, what we hope to learn together soon, and what questions remain on the horizon. We look forward to engaging as a community on this topic at CIES 2024 and beyond.
Three ways to invest in evidence: pros, cons and key lessons for promoting change - Donika Dimovska, Jacobs Foundation; Samuel Kembou, Jacobs Foundation
What works for what works? Early lessons in promoting implementation science - Noam Angrist, Youth Impact; What Works Hub for Global Education, University of Oxford; Michelle Kaffenberger, What Works Hub for Global Education, Georgetown University