Updated MSC bibliography

PLEASE NOTE: The bibliography below has now been superseded by a more comprehensive bibliography here, which includes PDF copies of many of the papers plus a search facility. It will continue to be updated.

This (now older) page is intended to provide an update of the bibliography in the 2005 Most Significant Change (MSC) technique Users Guide.

Please feel free to suggest additions to this list, through the Comment facility below, or by emailing the editor (Rick Davies).

Papers

 

Powerpoints

  • Seven sets of slides, used for a 2-day MSC training in Delhi, 2008, by Rick Davies. Available on request, on condition of willingness to share any adaptations made.

YouTube video

Other

 

Guidance on Terms of Reference for an Evaluation: A List

This is the beginning of a new page that will list various sources of guidance on the development of Terms of Reference for an evaluation.

If you have suggestions for any additions (or edits) to this list please use the Comment function below.

Please also see the hundreds of examples of actual ToRs (and related docs) in the MandE NEWS Jobs Forum.

PS: Jim Rugh has advised me (5 June 2010) that “two colleagues at the Evaluation Center at Western Michigan University are undertaking an extensive review of RFPs / ToRs they’ve seen posted on various listservs; they intend to publish a synthesis, critique and recommendations for criteria to make them more realistic and appropriate.”

Card sorting methods: A List

Card / pile sorting is a simple and useful means of eliciting and aggregating qualitative data, in a participatory manner. In anthropology, it is described as pile sorting, and is used for domain analysis, in the field of cognitive anthropology. In website design it is known as card sorting.

Anthropology
Website design
Software
  • OptimalSort: Online card sorting software (free and paid versions).
  • SynCapsV2: For the analysis of the results of physical card sorts; can be downloaded and used on a desktop/laptop.
  • XSort: A free card sorting application for Mac, aimed at user experience professionals and social scientists.
  • KardSort: Perform web-based card sorting studies for free.
  • Miro Card Sorting template

 

Identifying and documenting “Lessons Learned”: A list of references

Editor’s note:

This is a very provisional list of documents on the subject of Lessons Learned, what they are, and how to identify and document them. If you have other documents that you think should be included in this list, please make a comment below.

Note: This is not a list of references on the wider topic of learning, or on the contents of the Lessons Learned.

2014

  • EVALUATION LESSONS LEARNED AND EMERGING GOOD PRACTICES. ILO Guidance Note No.3, April 2014. April 25, 2014. “The purpose of this guidance note is to provide background on definitions and usages of lessons learned applied by the ILO Evaluation Unit. Intended users of this guidance note are evaluation managers and any staff in project design or technically backstopping the evaluation process. There is separate guidance provided for consultants on how to identify, formulate and present these findings in reports.”

2012

2011

  • The NATO Lessons Learned Handbook. Second Edition, September 2011. “Human beings, who are almost unique in having the ability to learn from the experience of others, are also remarkable for their apparent disinclination to do so.” – Douglas Adams, author of The Hitchhiker’s Guide to the Galaxy

2009

2007

  • Lessons Learned from Evaluation. M. J. Spilsbury, C. Perch, S. Norgbey, G. Rauniyar and C. Battaglino. Special Study Paper Number 2, A Platform for Sharing Knowledge. United Nations Environment Programme, January 2007. Lessons presented in evaluation reports are often of highly variable quality and limited utility. They are “often platitudes borne of a felt need to demonstrate engagement in the ‘knowledge society’ or simply to satisfy the specified evaluation requirements”. Even where high quality lessons are developed, they are seldom communicated effectively to their intended audiences. In order to enhance the quality of lessons, improve their utilisation, and aid their dissemination and communication, a Framework of Lessons from evaluation is presented in this paper. The framework consists of common problems, issues and/or constraints to which evaluation lessons relate, developed using ‘mind-mapping’ software and ‘problem tree’ techniques. Evaluation lessons were systematically classified within the resulting Framework of Lessons. The proposed framework of evaluation lessons is best used within the context of interactive ‘face-to-face’ communication with project / programme managers to ensure that evaluation lessons truly become ‘lessons learned’.

2005

2004

  • Criteria for Lessons Learned (LL). A presentation for the 4th Annual CMMI Technology Conference and User Group, by Thomas R. Cowles, Raytheon Space and Airborne Systems, Tuesday, November 16, 2004.

2001

  • M. Q. Patton (2001) Evaluation, Knowledge Management, Best Practices, and High Quality Lessons Learned. American Journal of Evaluation, 22(3), 2001. Abstract: Discusses lessons to be learned from evaluation and best practices in evaluation and some ways to bring increased rigor to evaluators’ use of those terms. Suggests that “best” practices is a term to avoid, with “better” or “effective” being more realistic, and calls for more specificity when discussing lessons to be derived. (Full text not yet found online.)

1997

If you know of other relevant documents and web pages, please tell us by using the Comment facility below.

MandE NEWS email List

If you want to talk with others about monitoring and evaluation then join the MandE NEWS email list. It has more than 2,200 members worldwide, and is growing every day. You can access monthly summaries of the 2007 and 2008 postings here.

A list of M&E email lists

Please Note:

  • If you want to add a new M&E email list, please use the Comment facility at the bottom of this page.
  • If you want to join any of these email lists, click on the link for that list (don’t use the Comment facility).

General purpose lists

  • MandENEWS
    2600+ Members, Archives: Membership required. The MandE NEWS mailing list is part of the MandE NEWS website at www.mande.co.uk. Visitors to the website are invited to use the mailing list to exchange information with each other about monitoring and evaluation issues, especially as they relate to international development aid. The Editor of MandE NEWS will also use the mailing list to inform list members about significant updates to the MandE NEWS website. The MandE NEWS mailing list is managed by the Editor, Rick Davies (contact email …(more)
  • Eval-Net [link not working] 858 members as of 2006 (please help update this number) Knowledge sharing and learning continue to be a top corporate priority for UNDP. The purpose of the Evaluation Network is to strengthen UNDP’s evaluation knowledge base by disseminating good practices and lessons learned on monitoring and evaluation to a broad constituency and to foster results-based performance at both country and corporate levels. It will also help build UNDP staff capacity in measuring and assessing results. This network specifically aims to: Share and exchange experiences and knowledge and lessons distilled from evaluative work relating to programmes and projects; Mainstream results orientation within the work of UNDP’s six practice areas; Provide a forum for UNDP staff to share and deepen their knowledge of monitoring and evaluation practices and methodologies. This network is open to all UNDP staff members interested in and working on measuring and assessing results and who want to contribute and build their capacity in this area. (posted 16/12/06)
  • XCeval
    880+ Members, Archives: Membership required XCeval is a listserv for persons interested in issues associated with international and cross-cultural evaluation. Initially set up for the International and Cross-Cultural Topical Interest Group of the American Evaluation Association. Many of the postings (average 34/month) are announcements of short-term consultancies or full-time positions in international M&E-related jobs. Also exchanges of ideas of current interest to persons involved in the evaluation of international development. (updated 15/12/06)
  • American Evaluation Association Electronic Lists
    • EVALTALK Listserv 3100+ members An open, unmoderated list for general discussion of evaluation and associated issues sponsored by the American Evaluation Association. To subscribe, send mail to LISTSERV@BAMA.UA.EDU with the command (paste it!): SUBSCRIBE EVALTALK
    • EVALINFO Sponsored by American Evaluation Association (AEA) as the official electronic network for distribution of information to organization members and interested parties. Anyone can subscribe and receive mailings but non-AEA members cannot post to the list. To subscribe, send an e-mail to LISTSERVE@UA1VM.UA.EDU with the message: SUBSCRIBE EVALINFO <Firstname> <Lastname>
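The subscription step described above (mailing a single-line command to the LISTSERV address) can be scripted. The sketch below only composes and prints the message that would be sent; it assumes nothing beyond a POSIX shell, and actually sending it would require a configured mail client (e.g. piping the body into `mail`), which is not shown.

```shell
# Sketch: build the one-line LISTSERV subscription command as a message body.
# LISTSERV reads commands from the body of the email, not the subject line.
LIST_ADDRESS="LISTSERV@BAMA.UA.EDU"   # address from the EVALTALK entry above
COMMAND="SUBSCRIBE EVALTALK"          # the command to paste into the body

# Print what would be sent; with a configured mail client, something like
#   echo "$COMMAND" | mail "$LIST_ADDRESS"
# would do the actual sending.
echo "To: $LIST_ADDRESS"
echo "Body: $COMMAND"
```

The same pattern applies to the EVALINFO, EGAD and Arlist subscriptions below, with the list address and command swapped accordingly.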

Email lists focusing on specific evaluation themes, issues, or methods

  • AIMEnet Listserv 1000+ members, Archives, Membership required In 2004, MEASURE Evaluation teamed with the U.S. President’s Emergency Plan for AIDS Relief, USAID, CDC, UNAIDS, the World Health Organization, The Global Fund to Fight AIDS, Tuberculosis and Malaria, the World Bank Group, and UNICEF to create the HIV/AIDS Monitoring and Evaluation Network (AIMEnet) listserv. AIMEnet was initially created so we could stay in touch with participants from several Expanded HIV/AIDS Response M&E workshops. Today, the AIMEnet listserv has been broadened to include anyone interested in sharing technical experiences, tools and information in monitoring and evaluation (M&E) of HIV/AIDS programs around the world.
  • “Most Significant Changes” technique. 1100+ members. Archives, Membership required. This is moderated by Rick Davies and Jessica Dart (Melbourne). This egroup was formed to promote discussion about the use of an innovative method of monitoring, called the “Most Significant Changes” approach. This is a non-indicator based approach to monitoring, making use of a diversity of narrative accounts of change which are subject to an iterated, open and accountable selection process. It has already been applied in developed and less developed economies, in participatory rural development projects, agricultural extension projects, educational settings and mainstream human services delivery. Through discussion we hope to promote the wider use of the method, and further innovation and development in the method itself. Most Significant Changes monitoring is different from common monitoring practice in at least four respects: (a) The focus is on the unexpected, (b) Information about those events is documented using text rather than numbers, (c) Analysis of that information is through the use of explicit value judgements, (d) Aggregation of information and analysis takes place through a structured social process. This egroup will act both as a repository of information about people’s experiences with the MSC method to date, and as a nursery for ideas of how to take the method further – into new environments, where there are new opportunities and constraints.
  • Outcome Mapping Learning Community 700+ members globally, as of 2008. Public. Outcome Mapping is an innovative approach to project and programme planning, monitoring and evaluation with a strong focus on participatory learning. The major innovation is the emphasis on the behaviour change of key actors with whom the programme has an influence, rather than focussing on changes in state that may or may not be attributed to the programme. The community was set up to support users of the methodology and those interested in the concepts behind it. Come and discuss the theory, get advice on applying OM in your case and meet others interested in this approach to P,M&E. See the community brochure for more information or contact Simon Hearn.

  • Systems in Evaluation Discussion List
    290+ members, Archives. EVAL-SYS@LISTS.EVALUATION.WMICH.EDU

  • Theory-Based_Evaluation
    390+ Members, Archives: Public. Welcome to the Theory-Based Evaluation discussion list! In a context where evaluation is challenged by attribution, complex systems and the need for evidence-based policies, theory-based evaluation is seen as an effective response to these challenges. The purpose of this list is to provide a forum where practitioners and scholars can exchange and share ideas, lessons and methods associated with theory-based evaluation. Hence, this discussion list is dedicated to the evaluation of Institutional …(more)
  • Pelican Initiative: Platform for Evidence-based Learning & Communications for Social Change 700+ Members, Archives. Membership required. This platform seeks to bring together development practitioners from different disciplines, specialists and policy makers to explore this question, share experiences, and to push the agenda further on three themes: * Evidence and learning for policy change; * Learning in organisations and among partners; * Society-wide learning among a multitude of stakeholders.
  • LEAP IMPACT 160+ members, Archives. Membership required Leap Impact aims to improve the institutional performance of monitoring and evaluation practice related to information services, information products and information projects. It is a community of practice open to all individuals/organisations interested in the evaluation of information. LEAP IMPACT is a joint initiative of CTA, IICD, Bellanet, and KIT. It is facilitated by Sarah Cummings (KIT ILS), Neil Pakenham-Walsh (HIF-net-at-WHO) and Shaddy Shadrach (Oneworld South Asia).
  • NetworkEvaluation
    280+ Members, Archives: Membership required. The Network Evaluation mailing list is an extension of the Networks section of Monitoring and Evaluation NEWS at www.mande.co.uk. The focus of the Network Evaluation mailing list is on the exchange of information about: methodologies for, and experiences of, the evaluation of networks, including networks of individuals, groups and organisations, both face-to-face and electronically mediated networks; the use of social network analysis in international development aid projects; in planning, …(more)
  • PARTICIPATORY MONITORING AND LEARNING
    60+ Members, Archives: Membership required. This group on Participatory Monitoring and Learning (PM&L) has been created to facilitate interaction amongst a group of researchers, practitioners and others interested in the topic of participatory approaches to monitoring, evaluation and learning.
  • ODAfrica
    50+ Members, Archives: Public. Support group for OD Practitioners working for and in Africa. Initiative of OD Practitioners from Tanzania, Uganda, Ghana, South Africa, Angola, Zimbabwe and Zambia who attended a two-year OD Practitioners Formation Programme in 2004/2005.
  • Evaluation Feedback 30+ members, Archives. Membership required. This was moderated by Catherine Cameron, author of Evaluation feedback for effective learning and accountability,
  • EGAD List: Program evaluation, statistics and methodology list 170+ members. To send a message to all the people currently subscribed to the list, just send mail to EGAD@LISTSERV.ARIZONA.EDU. This is called “sending mail to the list”, because you send mail to a single address and LISTSERV makes copies for all the people who have subscribed. This address (egad@listserv.arizona.edu) is also called the “list address”. You must never try to send any command to that address, as it would be distributed to all the people who have subscribed. All commands must be sent to the “LISTSERV address”, listserv@listserv.arizona.edu.
  • Arlist: Action research mailing list. Arlist-L is a medium-volume, multidisciplinary electronic mailing list. It is a moderated forum for the discussion of the theory and practice of action research and related methods. Bibliography of over 50 references on meta-evaluation. References include discussions of the technical competence of individual evaluations, critical analyses of the impact of evaluations on less powerful groups, and managerial meta-evaluations of the perceived credibility and utility of the evaluation. To subscribe, send an e-mail (no subject) to request@psy.uq.oz.au with the message: SUBSCRIBE ARLIST <Firstname> <Lastname>. Or, to subscribe to arlist-L, point your browser at http://lists.scu.edu.au/mailman/listinfo/arlist-l

  • EDIAIS Forum (Enterprise Development Impact Assessment Information Service) 160+ members Joining the list: email info@enterprise-impact.org.uk You will then receive an e-mail asking you to confirm your subscription. Once you are a member: You will receive all messages sent to the list. To send a message to the list mail it to: ENT-IMP@enterprise-impact.org.uk – use either Reply to respond to the last contributor only or Reply All and your message will automatically be mailed to all list members.

Country specific M&E email lists

  • PREVAL – The Programme for Strengthening the Regional Capacity for Monitoring and Evaluation of IFAD’s Rural Poverty-Alleviation Projects in Latin America and the Caribbean owner-preval@mail.rimisp.org 1,400+ members
  • AfrEA
    180+ Members, Archives: Public Information and networking tool of the African Evaluation Association (AfrEA) In conjunction with the national associations and committed donors, AfrEA has helped develop the concept of an African evaluation community. This listserv aims to build on this concept, to broaden this community, by further promoting the sharing of information, resources and opportunities. The AfrEA Community listserv serves as a moderated forum for a wide range of stakeholders, from evaluators who are actively …(more)
    • LateNote: This is a new list recently started at Yahoo! Groups to replace the old list at Topica. Moving the members from the old to the new list is a slow process. However, the old list is still active and has 460 subscribers. (message from Lise Kriel, 30/06/06)
  • indo-monev 440+ Members, Archives: Membership required. This is a mailing list to build a network of Indonesian people anywhere in the world who are interested in, dedicated to, and professionally engaged in work on monitoring and evaluation and other related development issues, as well as development aid work, particularly in Indonesia. This network aims at more exchange of information, more knowledge building and more awareness of development monitoring and evaluation issues. Please join.
  • IAEVAL:  340+ members, Archives: Membership required The purpose of this listserv is to enhance communication among members of the US-based International NGO (INGO) community about program design, monitoring, evaluation and effectiveness. The target participants of IAEVAL are those of us who are directly or indirectly responsible for INGO M&E. We hope that this will serve to enhance the communication, shared learning and collaboration among us as persons responsible for evaluation in the US-based INGO community.
  • Relac: 480+ Members, Archives: Membership required. This is the discussion group of the evaluation network of Latin America and the Caribbean.
  • REMAPP 150+ members, Archives: Membership required. REMAPP is a [UK-based] group of networking professionals concerned with planning, appraisal, monitoring, evaluation, research and policy issues in aid agencies.
  • MandENigeria
    90+ Members, Archives: Moderators only. This listserv is for interested individuals and institutions to share knowledge, opportunities, experience and other resources in M&E. It is also an opportunity to access professional consultants in Monitoring and Evaluation in Nigeria and Africa. It is an informal medium to support capacity building, strengthening and dissemination of Monitoring and Evaluation information in Nigeria under a Network of Monitoring and Evaluation. Evaluators are advised and encouraged to join and participate …(more)
  • IndiaEvalGroup
    30+ Members, Archives: Membership required. This discussion group consists of evaluators from India or evaluators working on Indian projects. The potential benefits of forming and participating in such a group are: 1. Fellowship with others working in a similar area; 2. Encouraging sharing of learning across content and context areas.
  • MONEV_NGO
    20+ Members, Archives: Membership required. Established in Jakarta, Indonesia in 2004. It was started by a group of activists concerned about the Monitoring and Evaluation skills that need to be developed by NGOs in Indonesia. This is an open forum, so please participate in sharing and discussing lessons learnt and experiences in Monitoring and Evaluation.
  • MandE Ghana 30+ Members, Archives: Membership required This email list has been established for people who have an interest in monitoring and evaluation as applied in Ghana. It is open to people living in Ghana and those elsewhere who have a connection with Ghana. Its aim is to: (a) encourage mutual learning between members, through exchange of questions and answers; (b) make opportunities quickly available to members, concerning M&E related training events, conferences, workshops and employment vacancies; (c) enable members to make contacts with other members with related M&E interests.
  • MandEPilipinas 9 Members, Archives: Membership required. This discussion group is meant for Monitoring and Evaluation professionals in the Philippines. It is a venue to network, exchange ideas and discuss new developments about the field with M&E practitioners in the country to promote mutual learning and intellectual growth.
  • EgDEvalNet < 5 Members, Archives: Membership required. This discussion group was established to discuss the evaluation of development activities in Egypt. This includes: improving development evaluation activities; exchange of experience between evaluation practitioners; providing feedback for improving development planning; discussing the establishment of an Egyptian Development & Evaluation Association; defining standards and guidelines for evaluation practice applicable to the Egyptian environment; developing development evaluation criteria and tools …(more)

How to set up an email list

  • Use Yahoo Groups, as used by many of the email lists shown above.
    • Go to http://groups.yahoo.com/
    • Sign up to get a Yahoo ID (you need to give yourself a username and password, once only).
    • Look for Create Your Own Group
      • Click on Start a Group Now, then follow the instructions
  • Or, use Dgroups
    • Go to http://www.dgroups.org/
      • Dgroups currently supports 1818 groups, containing 60690 members.
    • See if you can work out how to join, and set up a group. It is not easy.

Invitation to join a dedicated discussion forum on reconstructing baseline data

From: XCeval@yahoogroups.com On Behalf Of Jim Rugh
Sent: April 28, 2008 5:47 PM
To: XCEval listserv; MandE NEWS
Subject: [XCeval] Invitation to join a dedicated discussion forum on reconstructing baseline data

We realize that any evaluation that purports to be an “impact evaluation” needs to compare “before-and-after” (pre-test + post-test data) and “with-and-without” (the counterfactual – what would have happened without the intervention being evaluated). Yet in our experience the majority of evaluations conducted of development projects and programs do not have comparable baseline data, nor appropriate comparison (much less “control”) groups. Although the discussion of counterfactuals and pre-test + post-test comparisons frequently focuses on quantitative evaluation designs, the need to understand baseline conditions is equally important for qualitative evaluations. What can be done to strengthen evaluations in such cases? In other words, what can be done to reconstruct baseline and counterfactual data?

We (Jim Rugh and Michael Bamberger) are planning a follow-up volume to “RealWorld Evaluation: Working under budget, time, data and political constraints” (Sage Publications 2006). (More information can be found at www.RealWorldEvaluation.org.)
Continue reading “Invitation to join a dedicated discussion forum on reconstructing baseline data”

Impact Assessment: Training to be provided by INTRAC

Date: 07 May 2008 – 09 May 2008
Venue: London, UK

Training event organised by INTRAC

Course fee: £475.00
Number of days: 3
Description:
With increased pressures on delivery and accountability, the need has never been greater for civil society and other development organisations to assess the long-term impact of their work. In three fruitful days you will explore the current state of the debate about impact assessment as well as reviewing current practice and methodologies. Learn to assess the effectiveness of your work.

Course objectives:
• Understand what is meant by impact assessment and how the concept has emerged
• Explore the relationship between impact assessment and other forms of evaluative activity
• Explore different approaches and alternative methodologies in conducting impact assessment
• Identify ways of getting a representative picture e.g. case studies, sampling methods, and triangulation between quantitative and qualitative data
• Consider impact assessment in different contexts e.g. in programmes and projects, organisationally, and in advocacy work

Link to application form

Regulatory Impact Analysis (RIA) Training Course

Date: 6-10 October 2008
Venue: College of Europe, Bruges Campus, Belgium

Dear Colleague,

The College of Europe and Jacobs and Associates Europe invite you to participate in our 5-Day Regulatory Impact Analysis (RIA) Training Course on the principles, procedures, and methods of RIA. This practical, hands-on course was given in March and, due to demand, will be offered two more times in 2008 — in June and October. The course, by the most experienced public policy and RIA trainers in Europe, is expressly designed for policy officials and executives who use RIA to improve policy results.

The course will benefit any official using RIA in environmental, social and economic fields as well as stakeholders such as business associations, NGOs and consultants who want to understand better how to use RIA constructively. The course is open for subscription worldwide and is presented in the historic city of Bruges, Belgium. A discount is offered for early registration.

Information on RIA Training Course

2008 DATES: 23-27 June and 6-10 October (each course is 5 full days)
LOCATION: College of Europe, Bruges Campus, Belgium
REGISTRATION : For more information and application form go to www.coleurope.eu/ria2008
COST:

  • €2,995 for early registration (includes housing and meals)
  • €3,495 for regular registration (includes housing and meals)

REGISTRATION DEADLINES:

Early registration for the June course runs until 11 May 2008.
Registration closes 1 June 2008.

Early registration for the October course runs until 10 August 2008.
Registration closes on 14 September 2008.

OPEN : World-wide (only 40 seats available per session)
LANGUAGE OF INSTRUCTION: English
COURSE OFFERED BY: College of Europe and Jacobs and Associates Europe

The College of Europe provides a wide range of professional training courses, workshops and tailor-made seminars on the European Union in general or on targeted issues. For more information, please visit:
www.coleurope.eu/training or contact Mrs. Annelies Deckmyn by email: adeckmyn@coleurop.be

Jacobs and Associates continues to offer its tailored RIA training courses on-site around the world, adapted to the client’s needs. To discuss an on-site RIA course, contact ria@regulatoryreform.com. For information on the full range of regulatory reform work by Jacobs and Associates, see http://www.regulatoryreform.com/.

Best wishes,
Scott Jacobs
Managing Director, Jacobs and Associates Europe
