The Evaluation of Storytelling as a Peace-building Methodology

Experiential Learning Paper No. 5
January 2011

www.irishpeacecentres.org

This paper is the record of an international workshop held in Derry in September 2010 on the evaluation of storytelling as a peace-building methodology. This was an important and timely initiative because there is currently no generally agreed method of evaluating storytelling, despite the significant sums of money invested in it, not least by the EU PEACE Programmes. It was in fact PEACE III funding that enabled this examination of the issue to take place. This support allowed us to match international experts in evaluation with experts in storytelling in a residential setting over two days. This mix proved incredibly rich and produced this report, which we believe is a substantial contribution to the field. It is an example of the reflective practice which is at the heart of IPC’s integrated approach to peace-building and INCORE’s focus on linking research with peace-building practice. Building on this and other initiatives, one of IPC’s specific aims is to create a series of papers that reflect the issues being dealt with by practitioners.

Contents:
Foreword 4
Introduction 5
Presentations, Interviews and Discussions 13
Final Plenary Discussion 52
Conclusions:
a. What we have learned about storytelling 65
b. What we have learned about the evaluation of storytelling 69
c. What next? 73
Appendix 1: Reflection Notes from Small Discussion Groups 75
Appendix 2: How does storytelling work in violently divided societies? Questioning the link between storytelling and peace-building 112
Appendix 3: Workshop Programme 116
Appendix 4: Speaker Biographies 118
Appendix 5: Storytelling & Peace-building References and Resources 122

PS: Ken Bush has passed on this message:

Please find attached an updated copy of the Storytelling and Peacebuilding BIBLIOGRAPHY.  Inclusion of web addresses makes it particularly useful.

INTRAC, PSO & PRIA Monitoring and Evaluation Conference

Monitoring and evaluation: new developments and challenges
Date: 14-16 June 2011
Venue: The Netherlands

This international conference will examine key elements and challenges confronting the evaluation of international development, including its funding, practice and future.

The main themes of the conference will include: governance and accountability; impact; M&E in complex contexts of social change; the M&E of advocacy; the M&E of capacity building; programme evaluation in an era of results-based management; the M&E of humanitarian programmes; the design of M&E systems; evaluating networks, including community-driven networks; and changing theories of change and how this relates to M&E methods and approaches.

Call for M&E Case Studies

Case study abstracts (max. 500 words) are invited that relate to the conference themes above, with an emphasis on what has been done in practice. The three best case studies will be selected through a competition, and their authors will be invited to the UK in advance to work on their presentations for a plenary session. We will also identify a range of contributions for publication in Development in Practice.
Download the full case study guidelines, and submit your abstracts via email to Zoe Wilkinson.

Case study abstracts deadline: 11 March 2011

Policy Practice Brief 6 – What Makes A Good Governance Indicator?

January 2011, Gareth Williams

http://www.thepolicypractice.com/papersdetails.asp?code=17

The rise to prominence of good governance as a key development concern has been marked by an increasing interest in measurement and the production of a huge range of governance indicators. When used carefully such indicators provide a valuable source of information on governance conditions and trends. However, when used carelessly they can misinform and mislead. The purpose of this brief is to make sense of the different types of governance indicator and how they are used and misused. It warns against the commission of ‘seven deadly sins’ representing the most common pitfalls. The paper puts forward guidelines to ensure a more careful use and interpretation of governance indicators, and highlights the need for providers of indicators to be subject to greater transparency, scrutiny, evaluation and peer review. From the perspective of political economy analysis the challenge is to make the indicators more relevant to understanding the underlying political processes that are the key drivers of better governance.

‘Realist evaluation – understanding how programs work in their context’; An expert seminar with Dr. Gill Westhorp; Wageningen, the Netherlands; 29-03-2011

Date: 29-03-2011
Venue: Wageningen, the Netherlands

Dear colleague,

With pleasure we would like to announce an expert seminar with Dr. Gill Westhorp on 29th March 2011: ‘Realist evaluation – understanding how programs work in their context’.

Realist evaluation (Pawson and Tilley, 1997) is one type of theory-based evaluation. It aims to explore “what works, for whom, in what contexts, to what extent and how”. It adopts a particular understanding of how programs work, and uses a particular format for program theories to help guide evaluation design, data collection and analysis.

Realist evaluation has a particular focus on understanding the interactions between programs and their contexts and the ways that these influence how programs work. Evaluation expert Dr. Gill Westhorp will discuss the concepts and assumptions that underpin this theory-based evaluation approach. What is it that realist evaluation brings to the table of evaluating development programs? How is it different from existing approaches in evaluation in development? How does it understand, and deal with, complexity? What new insights can help strengthen the utility of evaluation for development?

During the morning, Gill will introduce the basic assumptions and key concepts in realist evaluation.  She will also briefly demonstrate how these ideas can be built into other evaluation models using two examples.  These models – realist action research and realist program logic – are participatory models which were designed for use in settings where limited resources, lack of capacity to collect outcomes data, complex programs, and (sometimes) small participant numbers make evaluation difficult.  In the afternoon, the practical implications for evaluation design, data collection and analysis will be discussed. Examples and practical exercises will be included throughout the day.

If you are interested and not too far away around that time, please do come and join this interesting event!

Please find attached the factsheet flyer and the registration form. We also suggest making an early hotel booking (http://www.hofvanwageningen.nl/?language=en) as the hotel is already quite full. Please indicate to the hotel that you are booking a room for the ‘expert seminar realist evaluation’.

Note: the expert seminar with Dr. Michael Quinn Patton on ‘developmental evaluation’ unfortunately had to be cancelled due to personal reasons. We hope to organise another opportunity with him early next year.

Looking forward to meeting you here at the expert seminar on realist evaluation!

Cecile Kusters (CDI), Irene Guijt (Learning by Design), Jan Brouwers (Context, international cooperation) and Paula Bilinsky (CDI)

Kind regards / Hartelijke groeten,

Cecile Kusters
Participatory Planning, Monitoring & Evaluation – Managing for Impact
Multi-Stakeholder Processes and Social Learning
Centre for Development Innovation
Wageningen UR
P.O. Box 88, 6700 AB Wageningen, The Netherlands
Tel. +31 (0)317 481407 (direct), +31 (0)317 486800 (reception)
Fax +31 (0)317 486801
e-mail cecile.kusters@wur.nl
www.cdi.wur.nl
PPME resource portal: http://portals.wi.wur.nl/ppme/
MSP resource portal: http://portals.wi.wur.nl/msp/
www.disclaimer-uk.wur.nl

2010 Annual Report on Results and Impact of IFAD Operations

The IFAD Office of Evaluation (IOE) has released its eighth Annual Report on Results and Impact of IFAD Operations (ARRI), based on evaluations carried out in 2009. The main objectives of the report are to highlight the results and impact of IFAD-funded operations, and to draw attention to systemic issues and lessons learned with a view to further enhancing the Fund’s development effectiveness. The report synthesizes results from 17 projects evaluated in 2009 and draws upon four country programme evaluations in Argentina, India, Mozambique and Niger, as well as two corporate-level evaluations, namely on innovation and gender.

This year’s ARRI found that the performance of past IFAD-supported operations is, on the whole, moderately satisfactory. However, performance of operations has improved over time in a number of areas (e.g. sustainability, IFAD’s own performance as a partner, and innovation), even though additional improvements can be achieved in the future. The ARRI also found that projects designed more recently tend to perform better than older-generation operations, as overall objectives and design are more realistic and they devote greater attention to results management.

Some areas remain a challenge, notably efficiency, and natural resources and the environment. The ARRI also notes that IFAD can do more to strengthen government capacity, which is one of the most important factors in achieving results on rural poverty. With regard to efficiency, the ARRI notes that the efficiency of IFAD-funded projects has improved from a low base in 2004/5, even though there is room for further enhancement. Efficiency will be studied by IOE in much deeper detail in 2011, within the framework of the planned corporate-level evaluation on the topic.

The report is available on the IFAD website, at the following address: http://www.ifad.org/evaluation/arri/2010/arri.pdf

Hard copies of the report are available from IOE and can be requested via e-mail to evaluation@ifad.org.

Cordial regards,

IOE Evaluation Communication Unit

For further information, please contact:
Mark Keating
Evaluation Information Officer
Office of Evaluation
IFAD
Tel: +39-06-54592048
e-mail: m.keating@ifad.org

Call for Abstracts – Forthcoming Sri Lanka Evaluation Association Conference, June 6-10, 2011

Date: June 6-10, 2011
Venue: Colombo, Sri Lanka

Dear colleagues
The Sri Lanka Evaluation Association (SLEvA) will convene its fourth Evaluation conference in Colombo this coming June. The conference will start with two days of professional development workshops and will provide an opportunity for sharing knowledge and ideas with development practitioners, evaluators, users of evaluation and policy makers. Please find attached CALL FOR ABSTRACTS.

The overall theme of the conference will be ‘Evaluation for Policy and Action’, with the following subthemes: evaluation for influencing policy and policy evaluation; evaluation for supporting development programmes; evaluation in disaster reduction and management; evaluating networks and partnerships; building the evaluation field; evaluation methodologies and approaches; and other evaluation issues.

SLEvA invites abstracts of papers and proposals for panels, posters and exhibits. Abstracts should be no more than 250 words and should be sent by 15 April to sleva@sltnet.lk, copying somadesilva@gmail.com.

Full information on the conference will be available in the coming days on the SLEvA website: http://www.sleva.lk/

With warm regards,

Ada Ocampo
Regional Advisor – Evaluation
Asia-Pacific Shared Services Centre (APSSC)
UNICEF
19 Phra Atit Road, Bangkok 10200, Thailand
Tel: +66 2 356 9272 Fax: +66 2 280 5941
Email: aocampo@unicef.org
http://www.unicef.org/

“The Truth Wears Off” & “More Thoughts on the Decline Effect”

These two articles by Jonah Lehrer in the New Yorker (13/12/2010 and 03/01/2011) provide a salutary reminder of the limitations of experimental methods as a means of research when undertaken by mere mortals, with all their various limitations. In my reading, they do not dismiss or invalidate the use of experimental methods, but they do highlight the need for a longer-term view of the efficacy of those methods, one which situates their use within a social context.

Many thanks to Irene Guijt for bringing these articles to my attention, and others, via the Pelican email list.

In “The Truth Wears Off,” Lehrer writes that he “wanted to explore the human side of the scientific enterprise. My focus was on a troubling phenomenon often referred to as the ‘decline effect’, which is the tendency of many exciting scientific results to fade over time. This empirical hiccup afflicts fields from pharmacology to evolutionary biology to social psychology. There is no simple explanation for the decline effect, but the article explores several possibilities, from the publication biases of peer-reviewed journals to the ‘selective reporting’ of scientists who sift through data.”

In “More Thoughts on the Decline Effect” Lehrer responds to the various emails, tweets and comments posted in reply to his article. Amongst other things, he says: “I think the decline effect is an important reminder that we shouldn’t simply reassure ourselves with platitudes about the rigors of replication or the inevitable corrections of peer review. Although we often pretend that experiments settle the truth for us—that we are mere passive observers, dutifully recording the facts—the reality of science is a lot messier. It is an intensely human process, shaped by all of our usual talents, tendencies, and flaws.”

There are also many interesting comments following both of Lehrer’s posts.

The CES Learning and Innovation Prize is open for entries.

Closing date 17 January 2011

Charities Evaluation Services is celebrating the ways in which charities use monitoring information or evaluation findings to improve their work and influence others with the new Learning and Innovation Prize.

The Prize is aimed at highlighting the contribution that monitoring and evaluation makes to improving service delivery, not just accountability, and at rewarding organisations that make the best use of the information they have.

Please note: the deadline for entries is 5pm, 17th January 2011.

Prize categories

This inspiring new award is split into four categories:

  • small charities (annual turnover under £500,000)
  • large charities (annual turnover over £500,000)
  • funders
  • organisations that support other charities.

Who can enter

Organisations that fit one of the above categories and that have used monitoring information or evaluation findings to improve their work and influence others can enter. For further information and specific criteria, please see the Entry Guidelines below.

We are looking for situations where monitoring or evaluation was done by the organisation itself or where an external evaluator was involved. Winners will be expected to demonstrate evidence that the findings changed something about project or service delivery or use of their resources, or influenced others to do so.

For more information and to download an entry form visit: http://www.ces-vol.org.uk/prize

Charities Evaluation Services (CES) is the UK’s leading provider of training, consultancy and information on evaluation and quality systems in the third sector. We also publish PQASSO, the most widely used quality system in the sector.

CES is an independent charity. We work with third sector organisations and their funders.

Impact Evaluation Conference: “Mind the Gap”: From Evidence to Impact

Date: June 15-17 2011
Venue: Cuernavaca, Mexico

Each year billions of dollars are spent on tackling global poverty. Development programs and policies are designed to build sustainable livelihoods and improve lives. But is there real evidence to show which programs work and why? Are government and donor policies based on concrete and credible evidence?

The Mind the Gap conference on impact evaluation will address these questions and offer possible solutions. With a focus on Latin American countries, the conference will take place in Cuernavaca, Mexico, on June 15-17, 2011. It is co-hosted by the International Initiative for Impact Evaluation (3ie), the National Institute of Public Health of Mexico (INSP), the Inter-American Development Bank (IADB) and the Center for Labor and Social Distributive Studies, in coordination with the Impact Evaluation Network and the Poverty and Economic Policy Network (CEDLAS-IEN-PEP).

This conference will provide a platform to share and discuss experiences on how to best achieve evidence-based policy in sectors that are highly relevant for Latin America. To this end, the conference will mainstream a policy-focus into all its activities. The plenary sessions will address the challenges and progress made in building evidence into policy-making processes. The sector-focused sessions will be asked to address the engagement of stakeholders and policy-makers in the various studies presented. The conference will be preceded by a range of pre-conference clinics tailored to the interests and needs of both researchers and program managers.

The conference will accommodate only 400 attendees. The official languages of the conference are Spanish and English, and simultaneous translation will be provided for all conference sessions. Please register early to secure your attendance. Registration will open on 1 March 2011, and early-bird rates will be offered.

Check the conference website regularly for up-to-date conference information: http://www.impactevaluation2011.org/

Bursaries are being made available to developing country participants with a proven interest in impact evaluation.

Bursary applications will open on 1 March, with preference given to authors of accepted abstracts.

Expert seminar with Dr MQ Patton ‘Developmental evaluation’

Date: 29th March 2011
Venue: the Netherlands

NOW CANCELLED. Further information will be provided when available

With pleasure we would like to announce an expert seminar with Dr. Michael Quinn Patton on ‘Developmental evaluation – new kid on the evaluation block’.

Developmental evaluation is based on insights from complex dynamic systems, uncertainty, nonlinearity, and emergence. World-renowned, award-winning evaluation expert Dr. Michael Quinn Patton will discuss the developmental evaluation framework as detailed in his book ‘Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use’. Patton will illustrate how developmental evaluation can be used for a range of purposes: ongoing program development; adapting effective principles of practice to local contexts; generating innovations and taking them to scale; and facilitating rapid response in crisis situations.

Participant discussions will focus on developmental evaluation’s value for the development sector. How is developmental evaluation different from existing practice, such as ongoing reflective monitoring, in development? What new insights can help strengthen the utility of evaluation for development?

During the morning, Dr. Patton will explain developmental evaluation and illustrate it with many examples from his own experience. In the afternoon, participants will debate the practical application of developmental evaluation in development, based on participants’ existing evaluation questions.

If you are interested and not too far away around that time, please do come and join this interesting event!

For more info and registration: http://www.cdi.wur.nl/UK/newsagenda/agenda/DevelopmentalEvaluation_MichaelPatton
We suggest making an early hotel booking (http://www.hofvanwageningen.nl/?language=en) as the hotel is already quite full. Please also indicate to the hotel that you are booking a room for the ‘expert seminar with Patton’.

Looking forward to meeting you here!

Cecile Kusters (CDI), Irene Guijt (Learning by Design), Jan Brouwers (Context, international cooperation) and Karel Chambille (Hivos)
Kind regards / Hartelijke groeten,

Cecile Kusters
Participatory Planning, Monitoring & Evaluation – Managing for Impact
Multi-Stakeholder Processes and Social Learning
Centre for Development Innovation
Wageningen UR
P.O. Box 88, 6700 AB Wageningen, The Netherlands
Tel. +31 (0)317 481407 (direct), +31 (0)317 486800 (reception)
Fax +31 (0)317 486801
e-mail cecile.kusters@wur.nl
www.cdi.wur.nl
PPME resource portal: http://portals.wi.wur.nl/ppme/
MSP resource portal: http://portals.wi.wur.nl/msp/
www.disclaimer-uk.wur.nl