The EEF Education Database has supported practice and policy initiatives across the EEN EdLabs. Now, with support from the EEF, Innovations for Poverty Action (IPA) has undertaken a study exploring the opportunities and challenges of integrating the repository into policymaking processes through IPA's Embedded Evidence Lab model in Ghana and Côte d'Ivoire. In this blog, IPA and EEF share five key takeaways from the study.
Introduction
Over the past few decades, education researchers have built a substantial body of evidence on how to improve learning outcomes – covering everything from classroom pedagogical techniques to efficient ways to operate education systems at scale. While this research offers a significant opportunity for policymakers to strengthen education policy across the globe, in many contexts, uptake of this evidence remains limited.
For many policymakers, the issue is not whether research exists, but whether it’s useful, timely, and applicable to their reality. They often report difficulties distilling actionable insights from academic work or express skepticism about whether findings generated in one country can translate to their own context. In Low- and Middle-Income Countries (LMICs), institutional constraints compound these challenges. These constraints include limited dedicated resources, insufficient technical capacity, and the absence of structured processes to integrate evidence into decision-making — all of which restrict the systematic use of research in education policy.
Meeting this challenge requires intermediaries — teams that bridge the gap between research and policy by identifying evidence needs, synthesizing and contextualizing findings, and connecting them with decision-makers to support action. With support from the Education Endowment Foundation (EEF), Innovations for Poverty Action (IPA) recently completed a study exploring how IPA's Embedded Evidence Labs can serve this role of linking global research to local realities. Embedded Labs are teams that work hand-in-hand with governments to institutionalize the use of data and evidence in their policy processes, enabling them to improve their decision-making, policies, and programs.
The study explored strategies for integrating the EEF Education Database, a global repository of over 5,000 education studies, into policymaking processes in Ghana and Côte d'Ivoire. Through 35 interviews with repository owners, intermediaries, lab members, policymakers, and external partners, along with validation workshops in both countries, five lessons emerged on how evidence repositories and intermediaries can better serve policymakers and strengthen education systems worldwide.
Lesson 1: Gaps Remain Between the Global Evidence Base and the Challenges Facing Policymakers and Implementers
The global evidence base is vast but unevenly distributed. There are still too few studies from LMICs, limited representation of local scholars, and poor alignment with policymakers’ needs (Das, 2025). Research also shows that policymakers are more likely to trust evidence generated within their own countries — sometimes even discounting evidence produced abroad (Vivalt and Coville, 2024).
Academic studies also often omit crucial details about how interventions were implemented — the context, delivery strategies, or enabling conditions that made success possible. Policymakers frequently need more than causal evidence about what works — they need practical information about how to implement programs effectively. Without this implementation knowledge, adapting evidence for new settings becomes guesswork.
To bridge this gap, research must become more local, relevant, and contextualized. This means investing in research led by local scholars, documenting implementation details more thoroughly, and ensuring repositories highlight contextual factors alongside results. Only then can global evidence truly inform local policy.
Lesson 2: Governments Want Evidence, but Effective Synthesis Requires Planning
The good news is that governments want to use evidence, and ministries routinely reach out to intermediaries for support in answering pressing policy questions. But the reality of political timelines often means these requests come at the last minute — driven by tight decision deadlines or urgent policy reforms.
While this urgency creates opportunities for evidence use, it also presents challenges. Rapid timelines can limit the ability to produce rigorous syntheses or to build trustful, long-term partnerships with academic researchers and institutions. When ministries focus narrowly on immediate needs, they risk postponing the more strategic or structural questions that could unlock larger system-wide improvements. For Embedded Labs and other intermediaries, the challenge is balancing short-term responsiveness to provide rapid answers to urgent questions, while also helping governments think long-term and set a research agenda that addresses deeper, systemic issues.
One promising approach is the creation of Research and Learning Agendas — structured documents that consolidate and articulate a ministry’s evidence priorities over a multi-year period. Through collaborative workshops and stakeholder consultations, Research and Learning Agendas map key challenges and translate them into specific research questions. By establishing these priorities in advance, governments and partners can allocate resources wisely, commission higher-quality evidence syntheses when time allows, and still leave room for responsiveness when inevitable urgent policy questions arise.
Lesson 3: Strong Systems, Not Just Skilled People, Sustain Evidence Use
Even the best repositories and platforms are only as effective as the teams that use them. Ministries often employ talented staff, but high turnover is common. When trained personnel move on, institutional knowledge is lost, and evidence practices risk being reset to square one. Many ministries also lack standardized guidelines for evidence synthesis, which means individual research summaries can vary significantly in rigor and credibility. This inconsistency can undermine confidence in the findings, especially when policymakers are already cautious about relying on external evidence.
To address this challenge, governments and intermediaries need to invest in both people and processes. This means developing clear protocols and standard operating procedures for evidence synthesis, including training programs to equip new staff with the skills to commission, assess, and interpret research. It can also mean establishing long-term partnerships with organizations that specialize in synthesis, ensuring access to high-quality expertise.
These steps not only strengthen the credibility of evidence syntheses but also help institutionalize evidence use so it outlasts individual staff members. When evidence synthesis processes are well-documented and consistently followed, they become part of the institutional fabric rather than depending on individual champions.
Lesson 4: Embedding Evidence is About Relationships, Not Just Data
For research to influence policy, it must be trusted — and that trust comes from relationships. Policymakers are more likely to use evidence when they value the process of its generation and feel ownership over the insights it produces.
Embedded Labs help foster this trust by acting as evidence brokers. They don’t just hand over reports; they work alongside policymakers to identify pressing questions, contextualize findings, and support decision-making. This collaborative model ensures evidence is not an abstract academic exercise but a tool that directly responds to the realities officials face.
Crucially, Embedded Labs also play roles beyond research. They help build networks, mobilize resources, and strengthen governance systems. They facilitate the institutional changes — such as budget allocations, staffing models, or legal frameworks — that allow evidence use to take root and flourish. Truly embedding evidence means making it part of the daily routines, incentives, and decision-making structures of government. Global repositories have an important role to play in this process, but unless evidence is embedded in the culture and operating practices of ministries, it risks ‘sitting on the shelf’.
Lesson 5: Above all, an Effective Evidence Synthesis Process Should Meet Policymakers Where They Are
Repositories and synthesis platforms like EEF’s Education Database offer powerful tools for organizing and disseminating evidence. However, their ultimate value lies not in the elegance of their databases, but in their ability to serve the needs of policymakers.
Most policymakers are not looking for abstract research summaries alone. They want practical insights that can be translated into implementable programs. They want to know: What would this look like in practice? What resources are required? What steps should we take next?
This is where intermediaries like Embedded Labs shine. By engaging directly with ministries, they can translate global evidence into actionable policy proposals and project plans. They help overcome the common problem of ‘analysis paralysis’, where officials feel overwhelmed by information and struggle to move from evidence to action.
As repositories expand and new tools — such as artificial intelligence — enhance the speed and sophistication of synthesis, it remains essential to keep the end-user in mind. Evidence synthesis must always be designed to meet policymakers where they are, in the midst of messy and complex decision-making processes.
Conclusion
The experiences of Embedded Labs in Ghana and Côte d’Ivoire highlight a central truth: governments genuinely want to use evidence, but face real constraints in doing so. Political pressures, capacity gaps, and questions of relevance all make it difficult to move from research to action.
The path forward requires combining global repositories with locally embedded approaches. By doing so, governments can make education policy more evidence-driven, context-sensitive, and sustainable. IPA’s Embedded Labs provide one model for bridging the gap between global research and local policy, ensuring that evidence not only informs academic debates but also shapes the classrooms and education systems of tomorrow.
Looking ahead, IPA will continue working with EEF and other partners to strengthen how evidence is synthesized and used — including exploring the potential of artificial intelligence to make synthesis more responsive and relevant. The future of evidence in education lies not just in building bigger databases, but in embedding evidence into decision-making cultures so that it consistently improves the lives of students around the world.