The Child Rights in Practice: Measuring our Impact conference

Date: October 25th – 30th, 2009
Venue: Whistler, British Columbia

We are pleased to announce the new dates for the Child Rights in Practice: Measuring our Impact conference. The conference will be held from October 25th to 30th in beautiful Whistler, British Columbia.

This conference promises to be an exciting, interactive journey to discover how we can be more accountable to children. It will explore the current methods of monitoring and evaluation and also introduce new ways to incorporate this practice into our daily work. Continue reading “The Child Rights in Practice: Measuring our Impact conference”

Training in the Most Significant Change Evaluation Technique (Cardiff, UK)

Date: 15th to 17th December 2009
Venue: Cardiff, UK

There are many different ways to collect and analyse data as part of an evaluation. Each has its merits, and each has its weaknesses. Recently there has been increased recognition that quantitative analysis (using numbers) may not always be appropriate, or may not give us the full picture. As Einstein said, “Not everything that can be counted counts, and not everything that counts can be counted”. The importance of stakeholder participation in evaluation has also gained prominence. A new suite of qualitative evaluation tools has emerged in response, with the Most Significant Change (MSC) story approach possibly the most prominent.
Continue reading “Training in the Most Significant Change Evaluation Technique (Cardiff, UK)”

Measuring Change II: Expanding Knowledge on Monitoring and Evaluation in Media Development

Date: Monday, 12 October, 2:30pm – Wednesday, 14 October, 2:30pm
Venue: Bad Honnef (near Bonn), Germany

Deadline for registration is August 15, 2009

Conference Aim

To share the latest trends, tools, and learning regarding:

  • Assessing media landscapes and training initiatives
  • Evaluating how media development cooperation may affect media’s potential to change societies
  • Approaches to M&E from project implementer and donor perspectives

Continue reading “Measuring Change II: Expanding Knowledge on Monitoring and Evaluation in Media Development”

Theory-Based Impact Evaluation: Principles and Practice

Howard White, June 2009. 3ie Working Paper 3.

Abstract

Calls for rigorous impact evaluation have been accompanied by the quest not just to find out what works but why. It is widely accepted that a theory-based approach to impact evaluation, one that maps out the causal chain from inputs to outcomes and impact and tests the underlying assumptions, will shed light on the why question. But application of a theory-based approach remains weak. This paper identifies the following six principles for successful application of the approach: (1) map out the causal chain (programme theory); (2) understand context; (3) anticipate heterogeneity; (4) rigorous evaluation of impact using a credible counterfactual; (5) rigorous factual analysis; and (6) use mixed methods.

Designing impact evaluations: different perspectives

Robert Chambers, Dean Karlan, Martin Ravallion, and Patricia Rogers, July 2009. 3ie Working Paper 4.

Preface (see text below)
Howard White, Executive Director, International Initiative for Impact Evaluation (3ie)

Making the Poor Count: Using Participatory Methods for Impact Evaluation

Robert Chambers, Institute of Development Studies, University of Sussex

Thoughts on Randomized Trials for Evaluation of Development: Presentation to the Cairo Evaluation Clinic

Dean Karlan, Yale University and Innovations for Poverty Action / Jameel Poverty Action Lab Affiliate

Evaluating Three Stylized Interventions
Martin Ravallion, World Bank

Matching Impact Evaluation Design to the Nature of the Intervention and the Purpose of the Evaluation

Patricia Rogers, Collaboration for Interdisciplinary Research, Consulting and Learning in Evaluation, Royal Melbourne Institute of Technology

Preface

Debates on approaches to impact evaluation design appear to have reached an impasse in recent years. An objective of the international conference Perspectives on Impact Evaluation, held in Cairo from March 29th to April 2nd and organized by 3ie, NONIE, AfrEA and UNICEF, was to bring together different voices and so work toward a consensus. A key session in this effort was a plenary in which experts with different perspectives were asked how they would approach the evaluation of three interventions: a conditional cash transfer, an infrastructure project and an anti-corruption program. The motivation for the session was that debates get stuck when they remain at the conceptual level, but that a greater degree of consensus can be achieved once we move to the specifics of the design of a particular evaluation. I am very pleased that the four presenters agreed to write up their views so they can be more widely disseminated.

Thanks are due to Hugh Waddington and Rizwana Siddiqui for assistance in the preparation of this collection.

Howard White
Executive Director, 3ie

American Evaluation Association (AEA) Annual Conference

Date: November 11 – 14, 2009
Venue: Orlando, Florida

The American Evaluation Association (AEA) invites evaluators from around the world to attend its annual conference to be held Wednesday, November 11, through Saturday, November 14, 2009 in Orlando, Florida. We are fortunate to be at the world-class Rosen Shingle Creek Resort, providing us with a beautiful venue and context for learning from one another as well as building in some time for relaxation and fun.

AEA’s annual meeting is expected to bring together approximately 2,500 evaluation practitioners, academics, and students, and represents a unique opportunity to gather with professional colleagues in a supportive, invigorating atmosphere.

The conference is organized into 41 Topical Strands comprising 655 sessions over the course of 3.5 days. These sessions examine the field from the vantage point of a particular methodology, context, or issue of interest, and include the Presidential Strand highlighting this year’s Presidential Theme of Context and Evaluation. Presentations may explore the conference theme or any aspect of the full breadth and depth of evaluation theory and practice.

To register for the Evaluation 2009 Conference, please visit our Eval 2009 page (eval.org/eval2009/). For additional information about the Conference or AEA in general, please contact Membership Director Heidi Nye via email at info@eval.org or by telephone at (888) 232.2275 or (508) 748.3326. Act quickly, as the discounted early registration rates expire September 26!

Identifying and documenting “Lessons Learned”: A list of references

Editor’s note:

This is a very provisional list of documents on the subject of Lessons Learned, what they are, and how to identify and document them. If you have other documents that you think should be included in this list, please make a comment below.

Note: This is not a list of references on the wider topic of learning, or on the contents of the Lessons Learned.

2014

  • EVALUATION LESSONS LEARNED AND EMERGING GOOD PRACTICES. ILO Guidance Note No. 3, April 2014. “The purpose of this guidance note is to provide background on definitions and usages of lessons learned applied by the ILO Evaluation Unit. Intended users of this guidance note are evaluation managers and any staff in project design or technically backstopping the evaluation process. There is separate guidance provided for consultants on how to identify, formulate and present these findings in reports.”

2012

2011

  • The NATO Lessons Learned Handbook. Second Edition, September 2011. “Human beings, who are almost unique in having the ability to learn from the experience of others, are also remarkable for their apparent disinclination to do so.” – Douglas Adams, author of The Hitchhiker’s Guide to the Galaxy

2009

2007

  • Lessons Learned from Evaluation. M. J. Spilsbury, C. Perch, S. Norgbey, G. Rauniyar and C. Battaglino. Special Study Paper Number 2, A Platform for Sharing Knowledge, United Nations Environment Programme, January 2007. Lessons presented in evaluation reports are often of highly variable quality and limited utility. They are “often platitudes borne of a felt need to demonstrate engagement in the ‘knowledge society’ or simply to satisfy the specified evaluation requirements”. Even where high-quality lessons are developed, they are seldom communicated effectively to their intended audiences. To enhance the quality of lessons, improve their utilisation, and aid their dissemination and communication, this paper presents a Framework of Lessons from evaluation. The framework organises the common problems, issues and/or constraints to which evaluation lessons relate, using ‘mind-mapping’ software and ‘problem tree’ techniques, and evaluation lessons were systematically classified within the resulting framework. The proposed framework is best used within the context of interactive ‘face-to-face’ communication with project/programme managers to ensure that evaluation lessons truly become ‘lessons learned’.

2005

2004

  • Criteria for Lessons Learned (LL). A presentation for the 4th Annual CMMI Technology Conference and User Group, by Thomas R. Cowles, Raytheon Space and Airborne Systems, Tuesday, November 16, 2004.

2001

  • M. Q. Patton (2001). Evaluation, Knowledge Management, Best Practices, and High Quality Lessons Learned. American Journal of Evaluation, 22(3). Abstract: Discusses lessons to be learned from evaluation and best practices in evaluation, and some ways to bring increased rigor to evaluators’ use of those terms. Suggests that “best” practices is a term to avoid, with “better” or “effective” being more realistic, and calls for more specificity when discussing lessons to be derived. (Full text not yet found online.)

1997

If you know of other relevant documents and web pages, please tell us by using the comment facility below.

International Course on Participatory Planning, Monitoring and Evaluation – Navigating and Managing for Impact

Date: 1 Mar 2010 – 19 Mar 2010
Venue: Wageningen, Netherlands

Dear colleagues,

I would like to draw your attention to this successful course, which we have been running since 2002 and which this year had three parallel sessions due to the high number of applications!

This course is organised by Wageningen International, part of Wageningen University and Research Centre. It focuses on how to manage for impact by integrating strategic guidance, operational management, and monitoring and evaluation in a learning environment, whilst navigating the external and internal context. Particular attention is given to designing and institutionalising participatory planning and M&E systems in development initiatives and organisations for continuous learning and enhanced performance. Attention is also paid to the relationship between management information needs and responsibilities and the planning and M&E functions. For more information, please visit our website: http://www.cdic.wur.nl/UK/newsagenda/agenda/Participatory_planning_monitoring_and_evaluation.htm or contact us: training.wi@wur.nl or cecile.kusters@wur.nl

If you are interested in receiving a scholarship, the deadline is 1st September, so please be quick!

Kind regards / Hartelijke groeten,

Cecile Kusters
Participatory Planning, Monitoring & Evaluation
Multi-Stakeholder Processes and Social Learning
Wageningen UR, Wageningen International
P.O. Box 88, 6700 AB Wageningen, the Netherlands
Visiting address: Lawickse Allee 11, Building 425, 6701 AN Wageningen, The Netherlands
Tel. +31 (0)317 481407
Fax +31 (0)317 486801

e-mail:  cecile.kusters@wur.nl
Website: www.cdic.wur.nl/UK
PPME resource portal: http://portals.wi.wur.nl/ppme/
MSP resource portal: http://portals.wi.wur.nl/msp/
www.disclaimer-uk.wur.nl

Evaluation of Conflict Sensitivity, Conflict Prevention and Peace Building Programmes

Date: 5-8 October 2009
Venue: Belgium

This annual course is an intermediate to advanced-level course based on the newest guidelines established by the OECD-DAC. It provides methodologies for assessing conflict sensitivity and conflict situations and, subsequently, for evaluating the performance of peace-building and conflict prevention activities, in a seminar format with a focus on methods and challenges. The course is intended for those with experience in evaluations and an interest in, and general experience of, conflict situations.

Drawing on Channel Research’s experience of running training programmes on evaluation, the course has in previous years attracted participants from aid agencies (headquarters and field personnel), donor governments, consultancies and academia. This 4-day (5-night) course is facilitated by Emery Brusset, Director of Channel Research; Tony Vaux, an expert on conflicts; and Koenraad Denayer, an expert in conflict sensitivity. It will take place at Orshof (www.orshof.be) near Brussels.

Please find attached the course outline and application form, also available via the link: http://www.channelresearch.com/peace-building/evaluation-of-peace-building. For any further information, please contact Maria Bak at bak@channelresearch.com.

You can find more information about Channel Research and our training courses at www.channelresearch.com

Utilization-focused evaluation for agricultural innovation

Michael Quinn Patton and Douglas Horton
ILAC Brief No 22

Utilization-focused evaluation (UFE) is based on the principle that an evaluation should be judged by its utility: no matter how technically sound and methodologically elegant it may be, an evaluation is not truly a good evaluation unless its findings are used. UFE is a framework for enhancing the likelihood that evaluation findings will be used and that lessons will be learnt from the evaluation process. This Brief, based on the book Utilization-Focused Evaluation, introduces the approach, outlines key steps in the evaluation process, identifies some of the main benefits of UFE, and provides two examples of UFE in the context of programmes aimed at promoting agricultural innovation.
