M&E Specialist · Researcher · Epidemiologist

Emmanuel Nene Odjidja

Better Evidence for Bettering Lives.


About

If a programme works, it should be proven using sound evidence, not anecdotes.

Emmanuel Nene Odjidja


M&E Specialist · Researcher · Epidemiologist

Originally from Ghana, I have spent over a decade working at the frontlines of global health and international development, from pastoralist communities in South Sudan to health facilities in rural Burundi and research institutions in the United Kingdom.

I hold a Master of Science in Global Health (Distinction) from Queen Margaret University in Edinburgh, with a focus on epidemiology. My professional conviction is simple: if a programme works, it should be proven using sound, methodologically rigorous evidence, not anecdotes.

My career has been defined by a singular commitment to building credible evaluation systems in the most challenging operational environments. I have designed and managed evaluations of programmes aimed at preventing violent extremism, strengthening health systems, and addressing the intersection of climate change, food insecurity, and conflict across West Africa, East Africa, the Sahel, North Africa, and South Asia.

My research spans maternal and child health, infectious disease control, nutrition, health financing, and the nexus between climate change, food insecurity, and violent extremism. I am bilingual with full professional proficiency in English and French.

Outside of work, I am a committed runner and half-marathoner, still chasing the dream of completing a full marathon.

Education

MSc Global Health (Distinction), Queen Margaret University, Edinburgh

Languages

English (Native) · French (Full Professional)

Expertise

Impact Evaluation · Epidemiology · Programme Evaluation · Mixed Methods · Health Systems · M&E Design · Stata/R · DiD · PSM · RCT

Experience

A Decade at the Frontlines

Building evidence across fragile and conflict-affected settings, from pastoralist communities to multi-country evaluation programmes.

2021 – Present

Geneva

M&E Specialist: Research, Design & Learning

GCERF — Global Community Engagement and Resilience Fund

Design and manage evaluations of PVE programmes across the Sahel (Burkina Faso, Mali, Niger), Tunisia, and Sri Lanka. Lead evaluation design, quality assurance, and evidence synthesis. Co-authored research on the climate–conflict–food insecurity nexus.

2024 – Present

Section Editor, Case-Based Evaluations

Journal of MultiDisciplinary Evaluation (JMDE)

Provide editorial leadership for the case-based evaluations section, guiding its thematic direction and standards for methodological rigour and practical relevance, and managing submissions end-to-end, from initial screening and peer-review coordination to final decisions.

2018 – 2021

Burundi

Research, Monitoring & Evaluation Technical Lead

Village Health Works

Led impact evaluations, set up M&E systems, and published peer-reviewed research on malnutrition, neonatal survival, hypertension, and TB. Founded the Kigutu M&E Institute, training 32 clinicians and staff on epidemiology, evaluation, and health systems.

2016 – 2018

South Sudan

M&E Advisor / Research Lead

AVSI Foundation

Conducted SMART nutrition surveys, designed quasi-experimental evaluations, and researched infectious disease control among pastoralist populations in humanitarian settings.

2013 – 2015

Ghana

Programme & Research Officer

Ghana Health Service / CRC / USAID

Early career in programme design, monitoring, and classroom-level learning assessments in education and health.

Research

Publications

Peer-reviewed research spanning infectious disease, nutrition, maternal health, health systems, and the climate-conflict nexus.

2025

Tuberculosis mortality and drug resistance among patients under TB treatment before and during COVID-19 in Burundi

BMC Infectious Diseases · Iradukunda, Getnet & Odjidja

2024

Small Fish Big Impact: Improving Nutrition during Pregnancy and Lactation, and Empowerment for Marginalized Women

Nutrients (MDPI) · Saha, Ng, Odjidja et al.

2024

Survival of newborns and determinants of their mortality in Burundi: A prospective cohort study

Research Square (Preprint) · Ndayishimiye et al. (incl. Odjidja)

2022

The effect of health financing reforms on incidence and management of childhood infections in Ghana: A matching DiD impact evaluation

BMC Public Health · Odjidja et al.

2021

Bibliometric analysis of the top 100 cited articles on HIV/AIDS

Annals of Infection · Gatasi, Musa & Odjidja

2020

Coronavirus disease 2019 and viral hepatitis coinfection: Provide guidelines for integrated screening and treatment

Journal of Medical Virology · Odjidja, Laurita Longo, Rizzatti & Bandoh

2020

2030 Countdown to combating malnutrition in Burundi: Comparison of proactive approaches for case detection

International Health (Oxford) · Odjidja et al.

2019

Delivery of integrated infectious disease control services under the new ANC guidelines: A service readiness assessment in Tanzania

BMC Health Services Research · Odjidja, Gatasi & Duric

AI for Good Lab

PRAXIS

An AI for Good lab pioneering the use of artificial intelligence in programme evaluation and development research.

The Challenge

Evaluation in fragile and conflict-affected settings demands methodological rigour under conditions that make rigour difficult. Yet the tools available to evaluators have barely changed in decades. PRAXIS exists to close that gap, bringing artificial intelligence into the service of better evidence.

The Approach

PRAXIS builds open-source AI tools that encode twelve years of field evaluation experience into systems any researcher or practitioner can use. By combining large language models with validated evaluation frameworks, the lab is making methodological expertise accessible to organisations that have never had the budget to hire specialist evaluators.

AI-assisted evaluation design spanning 20+ methodological approaches

Automated framework selection adapted for fragile and conflict-affected contexts

Open-source tools that democratise access to evaluation expertise

Field-tested methods bridging the gap between academic rigour and operational reality

In Practice

When an organisation needed to evaluate a countering violent extremism programme across three Sahelian countries, PRAXIS tools guided the selection of a contribution analysis framework, identified context-appropriate indicators, and structured a mixed methods design that balanced the need for causal evidence with the realities of operating in insecure environments.

Illustrative data (evaluation quality scores, before PRAXIS → after PRAXIS):

Relevance: 2.1 → 4.3
Coherence: 1.8 → 3.8
Effectiveness: 1.5 → 4.1
Sustainability: 1.2 → 3.6

The Result


Organisations using PRAXIS tools report faster evaluation design cycles while maintaining the methodological consistency required for credible evidence across multi-country programmes.

Commentary

Writing & Ideas


The Evaluation Gap: Why Development Programmes Fail to Prove Their Worth

Billions flow into development programming each year, yet fewer than one in five programmes undergoes a rigorous impact evaluation. The consequence is not merely academic. Without credible evidence of what works, funders recycle failed approaches, practitioners lose institutional memory, and the communities these programmes claim to serve bear the cost of well-intentioned guesswork. Closing this gap requires more than technical fixes. It demands a fundamental shift in how organisations value and invest in evaluation from the outset.


“The most valuable evaluations are not necessarily the most methodologically sophisticated ones. They are the ones designed with enough pragmatism to survive first contact with the field.”

Contact

The best evidence is built in partnership.

If you are exploring a research collaboration, designing an evaluation framework, or rethinking how evidence shapes policy, I would welcome the conversation.