Where do European Institutions rank on donor quality?

ODI Background Notes, June 2012. Author: Matthew Geddes

“This paper investigates how to interpret, respond to and use the evidence provided by recent donor quality indices, using the European Institutions as an example.

The debate on aid impact is longstanding and the tightening of budgets in the current financial crisis has led to a renewed focus on aid effectiveness, with the most recent iterations including three academic indices that rank the ‘quality’ of donors as well as the Multilateral Aid Review (MAR, 2011) by the UK Department for International Development (DFID).

These exercises are being used to assess donors’ comparative performance and foster international norms of good practice. The MAR is also being used to guide the allocation of DFID funds and identify areas of European Institution practice that DFID seeks to reform. This paper investigates how to interpret, respond to and use the evidence they provide, focusing on the European Institutions, major donors themselves and, taken together, DFID’s largest multilateral partner.

The paper presents scores for the European Institutions, reassesses this evidence and identifies issues that could make the evidence less robust, before working through several examples to see how the evidence that the indices present might best be applied in light of these criticisms.

The paper concludes that the indices’ conflicting results suggest that the highly complex problem of linking donor practices to aid impact is probably not a problem best suited to an index approach. On their own, the indices are limited in what they can be used to say robustly and, together, they produce a picture which is not helpful for policy-makers, especially when being used to allocate aid funding.”
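
RD Comment: To see why equally plausible indices can disagree so sharply, consider a minimal arithmetic sketch in Python. The donors, indicators, scores and weights below are all hypothetical illustrations, not figures from the paper; the point is that the same underlying indicator scores, aggregated under two defensible weighting schemes, can reverse a donor ranking outright.

# Minimal sketch: the same indicator scores, aggregated under two
# defensible weighting schemes, produce opposite donor rankings.
# All donors, indicators, scores and weights are hypothetical.

scores = {
    "Donor A": {"transparency": 0.9, "fragmentation": 0.4, "country_systems": 0.5},
    "Donor B": {"transparency": 0.5, "fragmentation": 0.8, "country_systems": 0.7},
}

# Two weighting schemes, each summing to 1, each arguably reasonable.
index_1 = {"transparency": 0.6, "fragmentation": 0.2, "country_systems": 0.2}
index_2 = {"transparency": 0.2, "fragmentation": 0.4, "country_systems": 0.4}

def composite(donor_scores, weights):
    # Weighted average: the usual composite-index construction.
    return sum(weights[k] * donor_scores[k] for k in weights)

for name, weights in (("Index 1", index_1), ("Index 2", index_2)):
    ranked = sorted(scores, key=lambda d: composite(scores[d], weights), reverse=True)
    print(name, [(d, round(composite(scores[d], weights), 2)) for d in ranked])

# Index 1 ranks Donor A first (0.72 vs 0.60); Index 2 ranks Donor B first
# (0.70 vs 0.54).

Identical evidence, opposite league tables: the kind of sensitivity to weighting choices that makes a composite score a shaky basis for allocating funds, which is exactly the ambiguity the paper highlights.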

Behind the scenes: Managing and conducting large scale impact evaluations in Colombia

by Bertha Briceño, Water and Sanitation Program, World Bank; Laura Cuesta, University of Wisconsin-Madison; Orazio Attanasio, University College London
December 2011, 3ie Working Paper 14, available as PDF

“Abstract: As more resources are being allocated to impact evaluation of development programs, the need to map out the utilization and influence of evaluations has been increasingly highlighted. This paper aims at filling this gap by describing and discussing experiences from four large impact evaluations in Colombia on a case-study basis. On the basis of (1) learning from our prior experience in both managing and conducting impact evaluations, (2) desk review of available documentation from the Monitoring & Evaluation system, and (3) structured interviews with government actors, evaluators and program managers, we benchmark each evaluation against eleven standards of quality. From this benchmarking exercise, we derive five key lessons for conducting high quality and influential impact evaluations: (1) investing in the preparation of good terms of reference and identification of evaluation questions; (2) choosing the best methodological approach to address the evaluation questions; (3) adopting mechanisms to ensure evaluation quality; (4) laying out the incentives for involved parties in order to foster evaluation buy-in; and (5) carrying out a plan for quality dissemination.”

On evaluation quality standards: a list

The beginnings of a list. Please suggest others using the Comment facility below.

Normative statements:

Standards for specific methods (and fields):

Meta-evaluations:

  • “Are Sida Evaluations Good Enough? An Assessment of 34 Evaluation Reports” by Kim Forss, Evert Vedung, Stein Erik Kruse, Agnes Mwaiselage and Anna Nilsdotter, Sida Studies in Evaluation 2008:1. See especially Section 6: Conclusion, 6.1 Revisiting the Quality Questions, 6.2 Why are there Quality Problems with Evaluations?, 6.3 How can the Quality of Evaluations be Improved?, and 6.4 Direction of Future Studies. RD Comment: This study has annexes with empirical data on the quality attributes of 34 evaluation reports published in the Sida Evaluations series between 2003 and 2005. It BEGS a follow-up study to see if/how these various quality ratings correlate in any way with the subsequent use of the evaluation reports. Could Sida be persuaded to do something like this?

Ethics-focused:

  • Australasian Evaluation Society

Journal articles:

Checklists:

  • Evaluation checklists prepared by Western Michigan University, covering Evaluation Management, Evaluation Models, Evaluation Values and Criteria, Metaevaluation, Evaluation Capacity Building / Institutionalization, and Checklist Creation

Other lists:

The American Evaluation Association annual conference: Evaluation Quality

Date: November 10-13, 2010
Venue: San Antonio, Texas

The American Evaluation Association invites evaluators from around the world to attend its annual conference to be held Wednesday, November 10, through Saturday, November 13, 2010 in San Antonio, Texas. We’ll be convening at the lovely Grand Hyatt San Antonio, right in the heart of the vibrant city and adjacent to the Riverwalk’s nightlife, restaurants, and strolling grounds. Discounted hotel reservations will be available in March.

AEA’s annual meeting is expected to bring together approximately 2500 evaluation practitioners, academics, and students, and represents a unique opportunity to gather with professional colleagues in a collaborative, thought-provoking, and fun atmosphere.

The conference is broken down into 44 Topical Strands that examine the field from the vantage point of a particular methodology, context, or issue of interest to the field as well as the Presidential Strand highlighting this year’s Presidential Theme of Evaluation Quality. Presentations may explore the conference theme or any aspect of the full breadth and depth of evaluation theory and practice.

Proposals are due by midnight in the Eastern time zone, on Friday, March 19, 2010.
For more information: http://www.eval.org/eval2010/10cfp.htm

Training: Evaluating the quality of humanitarian projects

Training course – ‘Evaluating the Quality of humanitarian projects’, postponed to 19-23 May 2008, Plaisians, France
This 5-day course will take place in the training centre at Groupe URD’s headquarters. The course looks at evaluation (definition, phases, types…) with regard to the Quality of humanitarian action.
It is organised around a full case study based on real experiences in the field.
The course is aimed at programme managers, project managers, M&E managers, evaluators, and others in the humanitarian sector involved in project management.
Language: French.
Registration form
Contact
