On evaluation quality standards: A List


The beginnings of a list. Please suggest others by using the Comment facility below

Normative statements:

Standards for specific methods (and fields):


  • “Are Sida Evaluations Good Enough? An Assessment of 34 Evaluation Reports” by Kim Forss, Evert Vedung, Stein Erik Kruse, Agnes Mwaiselage and Anna Nilsdotter, Sida Studies in Evaluation 2008:1. See especially Section 6: Conclusion, 6.1 Revisiting the Quality Questions, 6.2 Why are there Quality Problems with Evaluations?, 6.3 How can the Quality of Evaluations be Improved?, 6.4 Direction of Future Studies. RD Comment: This study has annexes with empirical data on the quality attributes of 34 evaluation reports published in the Sida Evaluations series between 2003 and 2005. It begs for a follow-up study to see if or how these various quality ratings correlate with the subsequent use of the evaluation reports. Could Sida be persuaded to do something like this?

Ethics focused

  • Australasian Evaluation Society

Journal articles


  • Evaluation checklists prepared by Western Michigan University, covering Evaluation Management, Evaluation Models, Evaluation Values and Criteria, Metaevaluation, Evaluation Capacity Building / Institutionalization, and Checklist Creation

Other lists:

Evaluation Revisited – Improving the Quality of Evaluative Practice by Embracing Complexity

Utrecht Conference Report. Irene Guijt, Jan Brouwers, Cecile Kusters, Ester Prins and Bayaz Zeynalova. March 2011. Available as pdf

This report summarises the outline and outputs of the conference ‘Evaluation Revisited: Improving the Quality of Evaluative Practice by Embracing Complexity’, which took place on May 20-21, 2010. It also adds insights and observations related to the themes of the conference that emerged in subsequent presentations about the conference at specific events.

Contents (109 pages):

1 What is Contested and What is at Stake
1.1 Trends at Loggerheads
1.2 What is at Stake?
1.3 About the May Conference
1.4 About the Report
2 Four Concepts Central to the Conference
2.1 Rigour
2.2 Values
2.3 Standards
2.4 Complexity
3 Three Questions and Three Strategies for Change
3.1 What does ‘evaluative practice that embraces complexity’ mean in practice?
3.2 Trade-offs and their Consequences
3.3 (Re)legitimise Choice for Complexity
4 The Conference Process in a Nutshell



DAC Network on Development Evaluation: Norms and Standards

“The DAC Network on Development Evaluation is a unique international forum that brings together evaluation managers and specialists from development co-operation agencies in OECD member countries and multilateral development institutions. Its goal is to increase the effectiveness of international development programmes by supporting robust, informed and independent evaluation.

A key component of the Network’s mission is to develop internationally agreed norms and standards to strengthen evaluation policy and practice. Shared standards contribute to harmonised approaches in line with the commitments of the Paris Declaration on Aid Effectiveness. The body of norms and standards is based on experience, and evolves over time to fit the changing aid environment. These principles serve as an international reference point, guiding efforts to improve development results through high quality evaluation.

The norms and standards summarised here should be applied discerningly and adapted carefully to fit the purpose, object and context of each evaluation. This summary document is not an exhaustive evaluation manual. Readers are encouraged to refer to the complete texts available on the DAC Network on Development Evaluation’s website: www.oecd.org/dac/evaluationnetwork. Several of the texts are also available in other languages.”

UNICEF Evaluation Report Standards

Evaluation Office, UNICEF NYHQ, September 2004

The UNICEF Evaluation Report Standards have been created as a transparent tool for quality assessment of evaluation reports. This document outlines what the Standards are, the rationale for each standard, and how they are applied. The Standards are used by the UNICEF Evaluation Office to assess evaluations for inclusion in the organisation’s Evaluation and Research Database, strengthening the Database as a learning tool. Application of the Standards will also provide feedback to UNICEF Country Offices on how an evaluation is seen by someone outside of the evaluation process.

The Standards are also intended for use by UNICEF offices and partners commissioning evaluations, to establish the criteria against which the final report will be assessed. The UNICEF Evaluation Report Standards draw from, and are complementary to, key references on standards in evaluation design and process increasingly adopted in the international evaluation community.

Pilot training course on the DCED Standard for results measurement

Date: 7-11 September 2009
Venue: Thailand

In response to growing demands to measure results, the DCED (Donor Committee for Enterprise Development) has been working with PSD (Private Sector Development) programmes in the field to define minimum elements for credible results measurement. These elements include:

  • articulation of the results chain(s) or logic;
  • definition of appropriate indicators;
  • good measurement practice;
  • attribution;
  • capturing wider changes in the market; and
  • relating results to costs.


Discussion of the use of the OECD/DAC Criteria for International Development Evaluations

Journal of MultiDisciplinary Evaluation
Vol 5, No 9 (2008)

The OECD/DAC Criteria for International Development Evaluations: An Assessment and Ideas for Improvement. Thomaz Chianca, pp. 41-51

An Association to Improve Evaluation of Development Aid. Paul Clements, pp. 52-62

Commentary on “An Association to Improve Evaluation of Development Aid”. Hellmut Eggers, pp. 63-69

Reply to Hellmut Eggers’ Comments on “An Association to Improve Evaluation of Development Aid”. Paul Clements, pp. 70-73

(These articles were mentioned today by Ian Patrick, Melbourne, Australia, in his post on the MandE NEWS email list.)

Workshop: Standards & Practices in Evaluating Development

Date: August 3-8, 2008
Venue: University of Bamako, Mali

Organizer: Association pour la Promotion de l’Evaluation au Mali (APEM) – Bamako – Mali

Mamadou Keita – President (mkeita@delta-c.org)

Workshop Coordinator: Ahmed Ag Aboubacrine, DME Coordinator, CARE International in Sierra Leone (ahmed1996@yahoo.fr)

This event is part of the 5th Mali Symposium on Applied Sciences, to be held at the University of Bamako from August 3-8, 2008.


Monitoring and evaluation of development interventions and of policy implementation has become a necessity since donors, the private sector, governments and local constituencies established international norms for evaluation practice.

Evaluation has become not only a cross-cutting academic field (studied and the subject of scientific research) but also a basic requirement in almost all sectors: health, education, finance, infrastructure, social services, agriculture, livestock, water and sanitation, urban planning, habitat, HIV/AIDS, transport, gender, corruption and governance.

Besides the standards set by independent organisations such as the African Evaluation Association (www.afrea.org) and IDEAS (http://www.ideas-int.org/), there are other specific evaluation mechanisms:

* African Peer Review Mechanism (APRM)

* Governance Index of Mo Ibrahim foundation

* Poverty Reduction Strategies (PRS) national evaluation mechanisms supported by the World Bank, the African Development Bank, UNDP and other UN agencies.

Each of the above-mentioned organisations has its own monitoring and evaluation norms and standards, most of which are very similar.

The objective of this mini-symposium is to promote monitoring and evaluation practice in Mali in order to ensure the effectiveness, efficiency, sustainability and impact of development interventions undertaken by the state, local constituencies, donors, NGOs and the private sector.

There are three specific objectives:

* Sharing existing evaluation standards (the state of the art)

* Exchanging views on current evaluation practice in Mali

* Identifying strategies for generalising and institutionalising evaluation practice among state actors

Submission of papers

Papers should be submitted in A4 format and should not exceed four pages. Papers on the standards and practices of evaluation in any area may be accepted.

For more details, visit our website:

http://www.msas.maliwatch.org/msas2008/msas2008_018.html
