2010 Annual Report on Results and Impact of IFAD Operations

The IFAD Office of Evaluation (IOE) has released its eighth Annual Report on Results and Impact of IFAD Operations (ARRI), based on evaluations carried out in 2009. The main objectives of the report are to highlight the results and impact of IFAD-funded operations, and to draw attention to systemic issues and lessons learned with a view to further enhancing the Fund’s development effectiveness. The report synthesizes results from 17 projects evaluated in 2009 and draws upon four country programme evaluations in Argentina, India, Mozambique and Niger, as well as two corporate-level evaluations, namely on innovation and gender.

This year’s ARRI found that the performance of past IFAD-supported operations is, on the whole, moderately satisfactory. Performance has nonetheless improved over time in a number of areas (e.g. sustainability, IFAD’s own performance as a partner, and innovation), though further improvements remain achievable. The ARRI also found that more recently designed projects tend to perform better than older-generation operations, as their overall objectives and designs are more realistic and they devote greater attention to results management.

Some areas remain a challenge, such as efficiency and natural resources and the environment. The ARRI also notes that IFAD can do more to strengthen government capacity, one of the most important factors in achieving results on rural poverty. With regard to efficiency, the ARRI notes that the efficiency of IFAD-funded projects has improved from a low base in 2004/05, even though there is room for further enhancement. IOE will study efficiency in greater depth in 2011, within the framework of the planned corporate-level evaluation on the topic.

The report is available on the IFAD website, at the following address: http://www.ifad.org/evaluation/arri/2010/arri.pdf

Hard copies of the report are available from IOE and can be requested by e-mail to evaluation@ifad.org.

Cordial regards,

IOE Evaluation Communication Unit

For further information, please contact:
Mark Keating
Evaluation Information Officer
Office of Evaluation
Tel: +39-06-54592048
e-mail: m.keating@ifad.org



“The DAC Network on Development Evaluation is a unique international forum that brings together evaluation managers and specialists from development co-operation agencies in OECD member countries and multilateral development institutions. Its goal is to increase the effectiveness of international development programmes by supporting robust, informed and independent evaluation.

A key component of the Network’s mission is to develop internationally agreed norms and standards to strengthen evaluation policy and practice. Shared standards contribute to harmonised approaches in line with the commitments of the Paris Declaration on Aid Effectiveness. The body of norms and standards is based on experience, and evolves over time to fit the changing aid environment. These principles serve as an international reference point, guiding efforts to improve development results through high quality evaluation.

The norms and standards summarised here should be applied discerningly and adapted carefully to fit the purpose, object and context of each evaluation. This summary document is not an exhaustive evaluation manual. Readers are encouraged to refer to the complete texts available on the DAC Network on Development Evaluation’s website: www.oecd.org/dac/evaluationnetwork. Several of the texts are also available in other languages.”


The DAC Network on Development Evaluation, OECD, 2010. Download a PDF copy


In June 2009, the Organisation for Economic Co-operation and Development (OECD) Development Assistance Committee (DAC) Network on Development Evaluation agreed to undertake a study of its members’ evaluation systems and resources. The study aims to take stock of how the evaluation function is managed and resourced in development agencies and to identify major trends and current challenges in development evaluation. The purpose is to inform efforts to strengthen evaluation systems in order to contribute to improved accountability and better development results. It will be of interest to DAC members and evaluation experts, as well as to development actors in emerging donor and partner countries.

To capture a broad view of how evaluation works in development agencies, core elements of the evaluation function are covered, including: the mandate for central evaluation units, the institutional position of evaluation, evaluation funding and human resources, independence of the evaluation process, quality assurance mechanisms, co-ordination with other donors and partner countries, systems to facilitate the use of evaluation findings and support to partner country capacity development.

This report covers the member agencies of the OECD DAC Network on Development Evaluation (see Box 1 for a full list of member agencies and abbreviations). Covering all major bilateral providers of development assistance and seven important multilateral development banks, the analysis provides a comprehensive view of current policy and practice in the evaluation of development assistance.

The study is split into two sections: Section I contains an analysis of overall trends and general practices, drawing on past work of the DAC and its normative work on development evaluation. Section II provides an individual factual profile for each member agency, highlighting its institutional set-up and resources.

Analyzing the Effects of Policy Reforms on the Poor: An Evaluation of the Effectiveness of World Bank Support to Poverty and Social Impact Analyses

World Bank, 2010

“The World Bank introduced the Poverty and Social Impact Analysis (PSIA) approach in fiscal 2002 to help governments and the Bank anticipate and address the possible consequences of proposed policy reforms, especially on the poor and vulnerable, and to contribute to country capacity for policy analysis. By fiscal 2007 the Bank had undertaken 156 pieces of analytical work using one or more elements of the PSIA approach (hereafter called PSIAs) in 75 countries and 14 sectors. Total donor support to PSIAs over fiscal 2004–06 was $15 million, which came from the Bank’s earmarked Incremental Fund for PSIAs ($5.8 million), earmarked PSIA Trust Funds contributed by various bilateral donors, and non-earmarked Bank budget and other donor funding.”…

“Although the Bank has submitted progress reports to donors regarding the implementation of PSIAs, it has not yet completed a comprehensive self-evaluation of the PSIA experience. This evaluation by the Independent Evaluation Group, requested by the Bank’s Board of Executive Directors, represents the first independent evaluation of the PSIA experience.”

Full text available online

New Handbook on Planning, Monitoring and Evaluating for Development Results

We are pleased to inform you that the new Handbook on Planning, Monitoring and Evaluating for Development Results was launched by the Administrator on September 15th, 2009. The Evaluation Office, the Operations Support Group, and the Capacity Development Group of the Bureau for Development Policy collaborated in revising the publication. Please click here to view the video clip of the launch, which was followed by a conversation between the Administrator and Mr. Bruno Pouezat, Resident Representative in Azerbaijan, on the importance of managing for development results in UNDP.
This Handbook is different from previous versions. Recognizing the importance of integrating results-based management at the design stage, this version includes a section on planning. As the Administrator emphasizes in her letter, working consciously towards results requires systematic planning, monitoring and evaluation. The Handbook is also intended to help you support national capacities in these areas in close collaboration with national counterparts and institutions.

You will soon receive printed copies of the Handbook in English, Spanish and/or French. You can also access the online version of the Handbook at www.undp.org/eo/handbook.

We hope that UNDP will find this publication useful in its effort to be a more effective partner for development.

Best regards,

Saraswathi Menon, Director, Evaluation Office | Judith Karl, Director, Operations Support Group, Executive Office | Kanni Wignaraja, Director, Capacity Development Group, Bureau for Development Policy

Training in Evaluation of Humanitarian Action

Date: 21st-24th June 2009
Venue: Belgium

Channel Research and the Active Learning Network for Accountability and Performance (ALNAP) are inviting participants for Training in Evaluation of Humanitarian Action, Belgium, 21st-24th June 2009 (actual training dates 22nd-24th June 2009).

This is an introductory-to-intermediate level course whose overall aim is to help participants design monitoring systems and to commission, manage, carry out and use small-scale evaluations in humanitarian action. The 3-day training course uses the OECD-DAC evaluation criteria and also introduces new material, specifically on joint evaluations and on innovative learning processes as part of an evaluation process.


Review of results-based management at the United Nations

A report of the Office of Internal Oversight Services, September 2008

“Results-based management at the United Nations has been an administrative chore of little value to accountability and decision-making”


Results-based management involves focusing on what occurs beyond the process of translating inputs into outputs, namely outcomes (or “expected accomplishments”) to which it seeks to bring accountability. An inherent constraint of results-based management is that a formalistic approach to codifying how to achieve outcomes can stifle the innovation and flexibility required to achieve those outcomes.

The Office of Internal Oversight Services (OIOS) finds that the introduction of results-based management in the Secretariat has been dealt with as an addition to the myriad rules and procedural requirements that govern inputs, activities, monitoring and reporting. It has not been accompanied by any relaxation of the volume, scope or detail of regulatory frameworks pertaining to financial, programmatic and human resource management. For each of these, there are separate and incompatible systems, rules and regulations.


Progress of the World’s Women 2008/2009. Full Report. Published by United Nations Development Fund for Women.


Progress of the World’s Women 2008/2009: Who Answers to Women? Gender and Accountability shows that realizing women’s rights and achieving the Millennium Development Goals depends on strengthening accountability for commitments to women and gender equality. The examples highlighted throughout the Report suggest that for women’s rights to translate into substantive improvements in their lives, and for gender equality to be realized in practice, women must be able to fully participate in public decision-making at all levels and hold those responsible to account when their rights are infringed or their needs ignored. Published at the halfway point to the 2015 deadline for achieving the MDGs, Progress presents clear evidence that women’s empowerment and gender equality are drivers for reducing poverty, building food security, reducing maternal mortality, and enhancing the effectiveness of aid.

The chapters in this volume examine how women’s efforts to expose gender-based injustice and demand redress have changed the ways in which we think about accountability. Acknowledging that different groups of women encounter distinct challenges in gaining access to their rights, Progress 2008/2009 highlights a wide range of examples, including those that show how the most excluded women are identifying accountability gaps and calling for redress.

Improving accountability to women begins with increasing the numbers of women in decision-making, but it cannot stop there. It requires stronger mandates, clearer performance indicators, better incentives and sustained advocacy efforts – in short, good governance. Progress 2008/2009 shows that good governance needs women and women need good governance if commitments to gender equality are to be met nationally and globally.

Evaluation of the Implementation of the Paris Declaration

Thematic study: The applicability of the Paris Declaration in fragile and conflict-affected situations

Executive Summary

The September 2008 DAC High Level Forum (HLF) in Accra provides an opportunity to discuss the challenges of applying the Paris Declaration in fragile and conflict-affected situations. This report aims to provide evidence to inform these discussions by:
• Synthesising existing evidence on the aid effectiveness and state-building challenges faced in fragile and conflict-affected situations;
• Exploring the relevance and application of the Paris Declaration and the Fragile States principles in different contexts of fragility and conflict; and
• Setting out the key challenges to improving effective engagement by development partners in fragile situations.

This paper is based on a review of the primary and secondary literature. As part of the review, four country case studies (Afghanistan, Burundi, the DRC and Nepal) were carried out. These are included as annexes to the report.

Monitoring and Evaluation for Results – Vienna, Austria

URL: www.worldbank.org/wbi/evaluation/training/vienna2008
Date: December 2008

Join us in December 2008 in Vienna for a one-week course covering the fundamental issues of monitoring and evaluation (M&E) for programs and projects. The course is organized by the World Bank Institute Evaluation Group – which has taught over 4,000 participants in 60 countries over the last decade – and is delivered by senior evaluation experts. Combining attention to M&E theory with practical application, the course will equip its 30 participants with the tools, techniques, and resources needed for planning, organizing, and managing M&E for programs and projects.

1st offering: December 8 (Mon) – 12 (Fri), 2008 (available in English or Russian)
2nd offering: December 15 (Mon) – 19 (Fri), 2008 (available in English only)
