Inter-American Development Bank (IDB) – where have your ex-post evaluations, and learning from them, gone?

A LinkedIn colleague, Gillian Marcelle, Ph.D., recently asked me about ex-post evaluations by the Inter-American Development Bank (IDB), as more Caribbean accelerators/incubators were being planned without learning from previous, nearly identical tech investments. Here is what I found, and it is not reassuring; if anyone knows more, please contact me. Some of the evaluations were internal ‘self-evaluations’, some were desk reviews, and only a few involved going to the field to ask aid recipients what lasted, which is typical for multilaterals (the ADB and IBRD do the same). Given Valuing Voices’ focus on participants’ voices in results, I tried to focus on those, but the report did not make clear which were which, so highlights are presented below.

In October 2003, the IDB created an Ex Post Policy (EPP), which “mandated two new tasks to OVE: the review and validation of Project Completion Reports and the implementation of ex post project evaluations.” These tasks came under the Board’s request for “a commitment to a ‘managing for results’ business model.” The 2004 ex-posts were described as “the first year of the implementation of the EPP, all 16 evaluations can be considered part of the pilot and the findings presented in this report refer to the entire set of ex post evaluations.” Further, the general evaluative questions proposed by the EPP are, first, “… the extent to which the development objectives of IDB-financed projects have been attained” and, second, “… the efficiency with which those objectives have been attained.”

They spent over $300,000 unsuccessfully evaluating six of the projects, in part due to data quality: “six had an evaluation strategy identified in the approval stage, most had abandoned the strategy during execution prior to project closure and, with one exception which produced data that could be used to calculate a treatment effect, none had produced quality evaluative information…. No [Project Completion Report] PCR provided adequate information regarding the evolution of development outcomes expected from the project or an update with respect to the evaluation identified at the time of approval.” For the other six, OVE found that the expected results did not match the sustained results: some were better than expected, while more were worse. “A critical finding across all projects is the lack of correspondence between the reflexive estimates and the treatment effect estimates. In practically all cases, the estimates were different.” How sustainably “Improved” are “Lives”, as the IDB’s logo touts?

In 2004, the IDB chose 16 projects to evaluate and dropped four for a variety of reasons. The remaining dozen projects evaluated ex-post covered land development, neighborhood improvement, and cash programming, and there were data quality and comparability issues from the outset. On land [tenure] ‘regularization’: “Six of the projects mention ex post evaluations in loan proposals, but none have been completed to date. OVE was successful in retrofitting a subset of outcomes expected for three projects: an attrition rate of 50%.” The neighborhood improvements had both positive and negative results, with ‘retrofitting’ needed on the data. For the four projects evaluated, the overall conclusion was mixed: the projects led to “greater coverage of certain public services,” and in two cases “this impact was more pronounced for the poorest segments included in the treated population.” Nonetheless, much more went unachieved: “Beyond this, very little else can be said. The impact on the objectives related to human capital formation and income were not demonstrated. In the case of health interventions, perhaps the intervention type most directly linked to sanitation services; there has been no demonstrated link between the interventions and outcomes, even for the poorest segments of the beneficiary population. There was also no consistent evidence showing an increase in variables related to housing values.”

Regarding cash programming, individual evaluations showed promise, but only after statistical comparison with a control group, something sorely lacking in most foreign aid evaluations. An IDB project in Panama “shows that in some cases the reflexive evaluation, in fact, understated the true program treatment effect. The development outcome of this project was the reduction in poverty. A reflexive evaluation (the gross effects) of the incidence of poverty suggested that not only was the project unsuccessful but that it actually contributed to worsening poverty; the opposite of its intent. However, a treatment effect evaluation (the net effect) that compared ‘similar municipalities’ shows that municipalities benefiting from FIS funds had a significant decline in poverty relative to comparable municipalities that did not receive FIS financing; the project had clear positive development outcomes.”
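
To see why the two estimates can tell such different stories, here is a minimal sketch contrasting a reflexive (before-after) estimate with a simple difference-in-differences treatment effect that uses comparison municipalities. This is not the IDB’s or OVE’s actual method or data; the poverty rates below are invented purely for illustration.

```python
# Hypothetical poverty headcount rates (%) before and after a program.
# A reflexive evaluation looks only at treated municipalities over time;
# a treatment-effect (difference-in-differences) evaluation also uses
# comparable untreated municipalities to net out the broader trend.

treated_before, treated_after = 40.0, 44.0        # funded municipalities (invented)
comparison_before, comparison_after = 41.0, 50.0  # similar, unfunded ones (invented)

# Reflexive (gross) effect: change in the treated areas alone.
reflexive = treated_after - treated_before  # +4.0 points: poverty appears to worsen

# Treatment (net) effect: treated change minus comparison change.
treatment_effect = (treated_after - treated_before) - (
    comparison_after - comparison_before
)  # 4.0 - 9.0 = -5.0 points: a relative decline in poverty

print(f"Reflexive estimate: {reflexive:+.1f} percentage points")
print(f"Difference-in-differences estimate: {treatment_effect:+.1f} percentage points")
```

With these invented numbers, the reflexive estimate suggests the program worsened poverty, while the comparison-group estimate shows funded municipalities faring markedly better than similar unfunded ones, which mirrors the pattern OVE reported for the Panama project.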

The IDB staff consulted in 2005 about the results “questioned whether the analysis of closed projects that were not required to include the necessary outcomes and data at the time of approval was a cost-effective use of Bank resources,” which may be why the Bank decided against doing more, despite many ex-post findings contradicting expected results. Astonishingly, since then only a summary of a Jordanian ex-post from 2007 was found, and it is questionable whether that was an actual post-closure ex-post. At a minimum, one would expect the Bank to ensure that data quality improved and that planned evaluation strategies were actually carried out.

Finally, presumably a bank cares about return on investment. As a former investment banker, I would be concerned about the lack of learning, given how cheap such learning is relative to the discrepancies found between expected and actual sustainability. The six evaluations that were completed cost roughly $113K each, or 0.001–0.21% of program value, a pittance compared to the millions in loan values. Given that no more were done, or at least publicly shared, in the 18 years since: sustainability-aware donors, beware.
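
As a rough illustration of how small that share is, here is a back-of-the-envelope sketch. Only the roughly $113K per-evaluation cost comes from the figures above; the loan values are invented simply to show the scale of the resulting percentages.

```python
# Ex-post evaluation cost as a share of loan value (loan values are hypothetical).
evaluation_cost = 113_000  # approximate cost per ex-post evaluation, USD

for loan_value in (55_000_000, 500_000_000, 11_000_000_000):
    share = evaluation_cost / loan_value * 100
    print(f"${loan_value:,} loan -> evaluation costs {share:.4f}% of its value")
```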

Unlike this multilateral, I’ve been busy with two ex-post evaluations, which I hope to share in the coming months… Let me know your thoughts!