Grow the 0.002% of all global development projects that are evaluated ex-post closure for sustainability

It seems like ‘fake news’ that after decades of global development, so few evaluations have peered back in time to see what was sustained. While I was consulting to the Policy, Planning and Learning Bureau at USAID, I asked the head of its M&E department, Cindy Clapp-Wincek, who does ex-post sustainability evaluation, since I knew USAID had done some in the 1980s. She answered, ‘No one, there are no incentives to do it.’ (She later became our advisor.)

Disbelieving, I did a year of secondary keyword research before devoting my professional consulting life to advocating for and doing ex-post evaluations of sustained outcomes and impacts. I searched USAID, OECD, and other bilateral, and later multilateral, donors’ databases and found thousands of studies, most of which were inaccurately labeled ‘ex-post’ or ‘post-closure’ studies. Of the roughly 1,000 projects I reviewed at USAID and the OECD that came up under ‘ex-post’, ‘ex post’, or ‘post closure’, some were final evaluations that were slightly delayed, and a few were evaluations done at least one year after closure but were desk studies without interviews. Surprisingly, the vast majority of the final evaluations found only recommended an ex-post evaluation several years later to confirm projected sustainability.

In 2016, at the American Evaluation Association conference, a group of us gave a presentation. In it, I cited these statistics from the first year of Valuing Voices’ research:

  • Of the 900+ documents in USAID’s DEC database tagged ‘ex-post’, ‘ex post’, or ‘post closure’, only 12 were actual post-project evaluations with fieldwork done in the previous 20 years
  • Of 12,000 World Bank projects, only 33 post-project evaluations asked ‘stakeholders’ for input, and only 3 showed clearly that they had talked to participants
  • In 2010 the Asian Development Bank conducted 491 desk reviews of completed projects and returned to the field for 18 post-project evaluations that included participant voices; it has done only this one study
  • We found no evaluations by recipient governments of aid projects’ sustainability

Twelve years of research, advocacy, and fieldwork later, the ‘catalysts’ database on Valuing Voices now highlights 92 fieldwork-informed ex-post evaluations by 40 organizations that returned to the field to ask participants and project partners what was sustained.

How many ex-post project closure evaluations have been done? About 0.002% of all projects. This statistic covers only public foreign development aid since 1960 (not counting private funding such as foundations or gifts to organizations, which isn’t tracked in any publicly available database). Aggregated OECD aid statistics (excluding private funding, for which only recent data exist) total $5.6 trillion over 62 years through 2022 (thanks to Rebecca Regan-Sachs for the updated numbers).

I then estimated 3,000 actual ex-posts: 2,500 JICA projects plus almost 500 others that I either found by searching databases across the spectrum of governments and multilaterals (almost 100 in our catalysts database) or assume were done in the 1980s-2000s by donors like USAID and the World Bank (roughly 400 more).

Without a huge research team it is impossible to aggregate data on the total number of projects by all donors. So I extrapolated from project activity disbursements for one year (2022) for Mali on the www.foreignassistance.gov page. In my 35 years of experience, Mali, where I did my doctoral research, typifies the average USAID aid recipient. It had 382 projects underway in 2022. I rounded up to 400 projects x 70 years (since 1960, when OECD data began) x 100 recipient countries for just one donor (of the 150 possible recipient countries, to be conservative). This comes to 2.8 million projects. If we then take all 39 OECD countries as donors (even though most have far less to give than the US), that yields roughly 109 million publicly funded aid projects disbursing $5.6 trillion since 1960. While final evaluations are industry standard, an estimated 0.002% of those 109 million projects have been evaluated ex-post with data from local participants and partners.
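To make that back-of-envelope estimate easy to check, here is a minimal sketch in Python using only the rough assumptions stated above (400 projects per country per year, 70 years, 100 recipient countries, 39 donors, roughly 3,000 ex-posts); none of these are measured totals, and the result is an order-of-magnitude figure, not a precise one.

```python
# Back-of-envelope estimate of the share of aid projects ever evaluated ex-post.
# Every input below is a rough assumption from the paragraph above, not measured data.

projects_per_country_per_year = 400   # Mali's 382 active projects in 2022, rounded up
years = 70                            # since 1960, when OECD aid data began (rounded)
recipient_countries = 100             # conservative subset of ~150 possible recipients
donor_countries = 39                  # OECD donor countries

# 400 x 70 x 100 = 2.8 million projects for one donor; x 39 donors = ~109 million
total_projects = (projects_per_country_per_year * years
                  * recipient_countries * donor_countries)

ex_post_evaluations = 3_000           # ~2,500 JICA plus ~500 others found or assumed

share = ex_post_evaluations / total_projects
print(f"Estimated total projects: {total_projects:,}")   # 109,200,000
print(f"Share evaluated ex-post: {share:.4%}")           # ~0.0027%, a few thousandths of one percent
```

However one rounds it, the share comes out to a few thousandths of one percent, which is the point of the headline figure: almost no projects are ever revisited after closure.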

This became Valuing Voices’ focus. We created an open-access database for learning and conducted our own evaluations. My team and I identified 92 ex-posts that returned to ask locals what lasted, what didn’t, why, and what emerged from their own efforts. We also created evaluability checklists and a new evaluation type, the Sustained and Emerging Impacts Evaluation (SEIE), which examines not only what donors put in place to last, but also what outcomes emerged from local efforts to sustain results with more limited resources, partnerships, capacities, and local ownership/motivation (the four drivers found by Rogers and Coates in USAID’s 2015 food security exit study). We have done 15 ex-posts for 9 clients since 2006 and shared Adaptation Fund ex-post training materials in 2023.

 

Yet the public assumes we know our development is sustainable. The 2015 Sustainable Development Goals focused aid on 17 themes and were expected to generate $12 trillion more in annual spending on SDG sectors beyond the $21 trillion already being invested each year. Nonetheless, a recent UN report states that there is now a $4 trillion annual financing gap to achieve the SDGs. All this funding goes to projects currently being implemented, not to evaluating what has been sustained from past projects that already closed. Such learning, from what succeeded or failed and what emerged from local efforts to keep activities and results going, is pivotal to improving current and future programming, yet it is almost wholly missing from the dialogue; I know, I asked multiple SDG evaluation experts.

 

Why do we return to learn so rarely? There are many reasons, the most prosaic among them being administrative.

  • When aid funds are spent over 2-10 years, projects are closed, evaluated at the end, and ‘handed over’ to national governments, and no additional funding exists to return after (‘ex-post’) closure to learn.
  • Next is the push to keep improving lives through implementation, which means low rates of overhead are allocated to M&E and learning during projects, much less after closure.
  • Another is the assumption that ‘old’ projects differ so much from new ones that little can be learned, yet there are few differences. After all, there are only so many ways to grow food, feed the malnourished, or educate children; evaluating ‘old’ projects can teach ‘new’ projects.
  • A last major one, which Valuing Voices’ 12 years of research suggests may be the largest: fear of admitting failure. Valuing Voices’ 2016 blog highlighted many Lessons about Funding, Assumptions and Fears (Part 3). One US aid lobbyist told me in 2017 that I must not share this lack of learning about sustained impacts because it could imperil US aid funding; I told her I had to tell people because lives were at stake.
  • Overall, there is much to learn; most ex-post evaluations show mixed results. None show 100% sustainability, and while most show 30-60% sustainability, none are 0% sustained either. If we don’t learn now to replicate what worked and cease what didn’t, future programming will be just as flawed, and successes, especially brilliant, locally designed outcomes that emerged after closure, such as Niger’s local funding of redesigned health incentives, will remain hidden.

 

Occasionally donors invest in sets of ex-post learning evaluations, such as USAID’s ‘global waters’ seven water/sanitation evaluations, linked to the E3 Bureau taking sustainability as a strategic goal. Yet the overall findings from USAID’s own staff in their ‘Drivers of WASH’ study of these ex-posts were chilling: while 25 million people gained access to drinking water and 18 million to basic sanitation, ‘they have largely not endured.’ The good news in such research is that the donor learned that infrastructure fails when spare parts are not accessible and maintenance is not funded or performed, which can be planned for and addressed during implementation by investing in resources and partnerships. They learned that relying on volunteers is unreliable and that management needs to be bolstered, which can lead some implementation funding to be focused on capacities and local ownership. We can plan better for sustainability by learning from ex-post and exit studies (see Valuing Voices’ checklists in this 2023 article on Fostering Values-Driven Sustainability).

 

And since 2019, three climate funds, the Adaptation Fund, the Global Environment Facility (GEF), and the Climate Investment Funds (CIF), have turned to ex-post evaluations to look at sustainability, longer-term resilience, and even transformation, given that environmental shocks may take years to affect project sites. The Adaptation Fund has done four ex-posts, with more to come in 2024/25, and the CIF is beginning now. The GEF has done a Post-Completion Assessment Pilot for the Yellow Sea region. Hopeful!

Presenting Lessons on (post-project) Sustained and Emerging Impact Evaluations from the U.S. AEA Conference

 


Dear readers, attached please find the ‘Barking up a Better Tree: Lessons about SEIE (Sustained and Emerging Impact Evaluation)’ presentation we gave last week at the American Evaluation Association (AEA) conference in Atlanta, GA [1]. I had the pleasure of co-presenting with Beatrice Lorge Rogers, PhD, Professor, Friedman School of Nutrition, Tufts University (of the famous Food for Peace/Tufts Exit Strategy study [2]); Patricia Rogers, PhD, Director, BetterEvaluation, and Professor, Australia and New Zealand School of Government (where we recently published guidance on SEIE [3]); and Laurie Zivetz, PhD, International Development Consultant and Valuing Voices evaluator.

 

We integrated our presentations from Africa, Asia and Latin America into this fascinating overview:

1. Sustained and Emerging Impact Evaluation: global context

2. SEIE: definitions and methods

3. Case studies: findings from post-project evaluations

4. Designing an SEIE: Considerations

5. Q&A, which fostered super comments; since you couldn’t come, please tell us what you think and what questions you have…

 

There are amazing lessons to learn about design, implementation, and M&E from doing post-project evaluation. We have also grown to appreciate that sustainability can be tracked throughout the project cycle, not just during a post-project SEIE.

We’ll be building this into a white paper or a … (toolkit? webinar series? training? something else?). What’s your vote ___? (I know, it’s US election season, so… :)).

 

What would you like to receive to support your learning about Sustained and Emerging Impact Evaluations? We look forward to hearing from you: Jindra@ValuingVoices.com

Enjoy!

The full presentation is available here:

https://valuingvoices.com/wp-content/uploads/2016/11/Barking-up-a-Better-Tree-AEA-Oct-26-FINAL.pdf

 

Sources:

[1] Cekan, J., Rogers, B. L., Rogers, P., & Zivetz, L. (2016, October 26). Barking Up a Better Tree: Lessons about SEIE (Sustained and Emerging Impact Evaluation). Retrieved from https://valuingvoices.com/wp-content/uploads/2016/11/Barking-up-a-Better-Tree-AEA-Oct-26-FINAL.pdf

[2] Food and Nutrition Technical Assistance (FANTA). (n.d.). Effective Sustainability and Exit Strategies for USAID FFP Development Food Assistance Projects. Retrieved from https://www.fantaproject.org/research/exit-strategies-ffp

[3] Zivetz, L., Cekan, J., & Robbins, K. (2017, May). Building the Evidence Base for Post-Project Evaluation: Case Study Review and Evaluability Checklists. Retrieved from https://valuingvoices.com/wp-content/uploads/2013/11/The-case-for-post-project-evaluation-Valuing-Voices-Final-2017.pdf