IPEN Conference on “Ensuring effective performance through evaluation”

Date: 25-27 September 2008
Venue: Yerevan, Armenia

(via Xceval email list)

Dear colleagues,

We are pleased to inform you that the 2008 IPEN (International
Programme Evaluation Network) Conference will take place in Yerevan,
Armenia, on 25-27 September. IPEN is the evaluation network covering the
countries of the Commonwealth of Independent States (CIS).

The 2008 Conference theme is “Ensuring effective performance through
evaluation”. Fourteen parallel sessions are planned, facilitated by
specialists from the region and around the world, along with three
one-day pre-conference training courses.

The Conference programme, including the pre-conference training, is now
available online (in English and Russian) at
http://eval-net.org/view_konf.php?id=2008

Best regards

Marco Segone
Regional Chief, Monitoring and Evaluation
UNICEF Regional Office for CEE/CIS
Tel: +41 (0)22 909 5550
Fax: +41 (0)22 909 5909
Email: msegone@unicef.org
Web: www.unicef.org/ceecis

Producing Social Accountability? The Impact of Service Delivery Reforms

Source: Joshi, A., 2008, ‘Producing Social Accountability? The Impact of Service Delivery Reforms’,
IDS Bulletin, Volume 38, Number 6, pp. 10-17
Author: Institute of Development Studies, http://www.ids.ac.uk/ids
Summary (from GSDRC website)

Which types of state reform improve public services and citizen engagement? How can accountability mechanisms improve service delivery? This Institute of Development Studies (IDS) paper draws on the polity approach, which suggests that the organisation of state institutions influences who engages in collective action and around what issues. Collective action is essential for the poor if direct accountability is to work. Successful cases of social accountability are often the result of alliances that cut across class and public-private divides.

Discussion of the use of the OECD/DAC Criteria for International Development Evaluations

Journal of MultiDisciplinary Evaluation
Vol 5, No 9 (2008)

The OECD/DAC Criteria for International Development Evaluations: An Assessment and Ideas for Improvement. Thomaz Chianca, pp. 41-51.

An Association to Improve Evaluation of Development Aid. Paul Clements, pp. 52-62.

Commentary on “An Association to Improve Evaluation of Development Aid”. Hellmut Eggers, pp. 63-69.

Reply to Hellmut Eggers’ Comments on “An Association to Improve Evaluation of Development Aid”. Paul Clements, pp. 70-73.

(These articles were mentioned today by Ian Patrick, Melbourne, Australia, in his post on the MandE NEWS email list.)

Measuring Effectiveness – Participation, Empowerment and Downward Accountability

Date: 25-26 September 2008
Venue: Melbourne

Greetings

The annual Measuring Effectiveness conference is now only six weeks away. Online registrations are open on the website: https://www.worldvision.com.au/learn/conferences/me/Register.aspx

The conference will be held on Thursday 25th and Friday 26th September, 2008 in Melbourne, Australia. This year the conference is being held in partnership with The Australian National University. The 2008 conference will explore the themes of “Participation, Empowerment and Downward Accountability.”

The sessions will include case studies and panel discussions on these themes, as well as a regional look at activities in Asia, the Pacific and Latin America/the Caribbean.
The draft conference schedule and session summaries will be available on the conference website within the next few days.

Customising definitions of outputs, outcomes and impact

(from the OM email list)
I am continually challenged, with the organisations I work with, to define the terms we will use. They are by and large Northern donors and their grantees: development NGOs and social change networks. I find that the meanings of outcome and impact (and output) vary considerably, and sometimes the terms are used interchangeably. Thus, one of the biggest mistakes I can make is to fail to define the terms right from the start.

In a recent discussion titled Outcomes vs. Impact on the American Evaluation Association listserv EVALTALK, I shared a simple instrument that I use for developing common agreement about the definitions of outputs, outcomes and impact. It can be found at:
http://www.outcomemapping.ca/resource/resource.php?id=189

Hope it is helpful to some of you planners, monitors, evaluators and those who employ them.

If you are interested in the AEA EVALTALK listserv, subscribe at www.eval.org. To use the archives, go to http://bama.ua.edu/archives/evaltalk.html. Please note that the discussion I refer to will only appear in the archives in September.

Best wishes,
Ricardo

ricardo wilson-grau consulting
NEW: Oude Singel 184, 2312 RH Leiden, Netherlands
Rua Marechal Marques Porto 2/402, Tijuca, Rio de Janeiro, CEP 20270-260, Brasil
Tel: 55 21 2284 6889, Skype: ricardowilsongrau

Evaluation of Humanitarian Action with ALNAP (Active Learning Network for Accountability and Performance in Humanitarian Action)

Date: 29-31 October 2008
Venue: near Brussels

This is an introductory-to-intermediate-level course with the overall aim of making evaluations of humanitarian action more effective in contributing to the improved performance of interventions, and of improving the quality of the evaluation process. The course is facilitated by Channel Research, Margie Buchanan Smith and John Cosgrave. It will be held near Brussels (La Converserie).

Download Course Description | Download Application Form

Training in Evaluation of Conflict Prevention and Peacebuilding Programmes

Date: 15-18 September 2008
Venue: near Brussels

This course, based on new guidelines established by the OECD, provides methodologies for assessing conflict situations and, subsequently, for evaluating the performance of peacebuilding and conflict prevention activities, in a seminar format. The course is intended for those with experience in evaluations and with an interest in, and general experience of, conflict situations. Participants in previous years have come from aid agencies (headquarters and field personnel), donor governments, consultancies and academia. The course is facilitated by Emery Brusset, Director of Channel Research, and Tony Vaux, an expert on conflicts, and will take place near Brussels.

Download Course Description | Download Application Form

Above information provided by

Annina Mattsson
Area Manager

Channel Research
Route des Marnières 45A
1380 Ohain

Tel: +32 2 6336529
Fax: +32 2 6333092
Mobile: +32 473235244
www.channelresearch.com

Reducing World Poverty by Improving Evaluation of Development Aid

Paul Clements (Western Michigan University, clements@wmich.edu), Thomaz Chianca (Western Michigan University), and Ryoh Sasaki (Western Michigan University)

American Journal of Evaluation, Vol. 29, No. 2, 195-214 (2008)

Abstract: This article argues that given its structural conditions, development aid bears a particularly heavy burden of learning and accountability. Unfortunately, however, the organization of evaluation guarantees that evaluations will be inconsistent and it creates incentives for positive bias. This article presents evidence from organizational studies of aid agencies, from the public choice literature, from eight development projects in Africa, and from one in India, that demonstrates positive bias and inconsistency in monitoring and evaluation. It proposes that the evaluation function should be professionalized through an approach titled “monitoring and evaluation for cost effectiveness,” and it compares this approach to the World Bank’s results-based monitoring and evaluation, the Development Assistance Committee’s five evaluation criteria, and evaluations based on randomized trials. This article explains the analytics of the proposed approach and suggests directions for further research.

Potential Human Rights Uses of Network Analysis and Mapping

A report to the Science and Human Rights Program of the American Association for the Advancement of Science. Skye Bender-deMoll, April 28, 2008. 47 pages.

Abstract: This report investigates potential new tools and existing applications of network analysis and network mapping to assist or facilitate human rights work. It provides a very brief overview of some network concepts, quick introductions to a number of relevant fields of research, and some specific examples of how people are currently using network tools for academic and applied work. The examples serve as an overview and entry point to the research areas. As this is a developing and fragmented field, classification is difficult. However, some common points exist and a few conclusions are presented. Some of the risks and challenges of network research are discussed, along with criteria for evaluating potential future projects. Finally, several possible projects are proposed.

A Handbook Of Data Collection Tools: Companion To “A Guide To Measuring Advocacy And Policy”

Authors: Jane Reisman, Anne Gienapp, and Sarah Stachowiak
Publisher: Organizational Research Services
Publication Date: 2007

Abstract
What are examples of data collection tools for evaluating advocacy?

This handbook of tools is a companion to ORS’ “A Guide To Measuring Advocacy And Policy”. The data collection tools included in the handbook have been used in practice to evaluate advocacy or related efforts. The data collection instruments apply to six outcome areas:

* Shifts in Social Norms;
* Strengthened Organizational Capacity;
* Strengthened Alliances;
* Strengthened Base of Support;
* Improved Policies; and
* Changes in Impact.
