2010 ENVIRONMENTAL EVALUATORS NETWORKING FORUM

Environmental Evaluation: Quality in an Era of Results-Based Performance
Date: June 7-8, 2010
Venue: Washington, D.C.

The Environmental Evaluators Network (EEN) will host its 5th annual Forum at The George Washington University in Washington, D.C. on June 7-8, 2010. The purpose of the EEN is to advance the field of environmental program and policy evaluation through more systematic and collective learning among evaluators and evaluation users. The 5th annual EEN Forum will bring together evaluators and users of evaluation to explore and articulate the significance of the emerging era of results-based performance for the quality of environmental evaluations.

The Issue:

Growing awareness of the interdependencies of our social, economic, and ecological systems requires more efficient use of scarce resources to evaluate complex problems. In this heightened era of accountability, recipients and funders of environmental programs want evidence of what works and what does not, and better mechanisms for using real-time information in decision-making. How will the era of results-based performance affect the quality of environmental evaluations? What must be done to improve the quality of environmental evaluations to meet the requirements and the desire for better and more accessible evidence of program and policy effectiveness?

“Monitoring and Evaluating Capacity Building: Is it really that difficult?”

By Nigel Simister with Rachel Smith. Published by INTRAC.

“Whilst few doubt the importance of capacity building, and the need for effective monitoring and evaluation (M&E) to support this work, the M&E of capacity building is as much a challenge now as it was two decades ago. This paper examines both theory and current practice, and aims to promote debate on some of the key barriers to progress.

The paper is primarily concerned with capacity building within civil society organisations (CSOs), although many of the lessons also apply to commercial and state organisations. It is based on a literature review and interviews with capacity building providers in the North and South.”

Evaluating Climate Change and Development

From: Juha Uitto, via the MandE NEWS email list

Folks,

I’d like to draw your attention to a new book, ‘Evaluating Climate Change and Development’, edited by Rob van den Berg and Osvaldo Feinstein. To my knowledge, this is the first volume that systematically addresses issues relating to climate change from an evaluation point of view. How do we know that the policies, programs and projects targeted towards mitigation, adaptation and vulnerability actually work? The book reviews evidence and approaches and also contains a wealth of case studies from the developing world.
…end of email…

Amazon Abstract: Climate change has become one of the most important global issues of our time, with far-reaching natural, socio-economic, and political effects. To address climate change and development issues from the perspective of evaluation, an international conference was held in Alexandria, Egypt. This book distills the essence of that timely conference, building on the experiences of more than 400 reports and studies presented. Developing countries may be particularly vulnerable to the expected onslaught of higher temperatures, rising sea levels, changing rainfall patterns, and increasing natural disasters. All societies will have to reduce their vulnerability to these changes, and this book describes how vulnerabilities may be addressed in a systematic manner so that governments and local communities may better understand what is happening. Different approaches are also discussed, including the use of human security as a criterion for evaluation as well as ways to deal with risk and uncertainty. “Evaluating Climate Change and Development” presents a rich variety of methods to assess adaptation through monitoring and evaluation. The volume deals with climate change, development, and evaluation; challenges and lessons learned from evaluations; mitigation of climate change; adaptation to climate change; vulnerability, risks and climate change; and a concluding chapter on the road ahead. Collectively, the authors offer a set of approaches and techniques for the monitoring and evaluation of climate change.

Editor PS: See also ‘Evaluating adaptation to climate change from a development perspective: Current state of evaluation of climate change adaptation interventions and next stages’ by Merylyn Hedger, Tom Mitchell, Jennifer Leavy and Martin Greeley, IDS, 2009. Funded by DFID and the GEF.

Meetings on evaluation and complexity

(Via Ben Ramalingam)

“The latest post on Aid on the Edge compiles presentations and reports from meetings on evaluation and complexity (NORAD, Panos, Mokoro) that have taken place in recent months and years, as well as meetings planned for later this year.”


Invitation to submit research accountability tools and systems for online database

Dear colleagues

I’m writing to ask you to submit any tools you know of, or have developed, which you think could help build the accountability of research organisations. These will be entered into a publicly available online database.

The One World Trust has been working with our partners to formulate an accountability framework for organisations conducting research (whether civil society, universities, private or public sector). The database will make available tools which will help research managers reflect on and improve the accountability of the programs they manage.

Specifically, we are looking for descriptions and accounts of innovative tools and processes which fall within the following broad areas:

  • Tools suitable for the monitoring and evaluation of research and advocacy;
  • Tools for participatory planning of research;
  • Tools to assist organisational learning and change;
  • Good practices and policies of transparency in research;
  • Good practices for working accountably in partnerships and networks;
  • Community engagement strategies with research participants;
  • Tools to ensure you ‘close the loop’ and manage feedback from your research participants;
  • Examples of where organisations conducting research have integrated external stakeholders into governance structures; and
  • Ethics standards for participatory and applied research.

Ideally, the description would take the form of a document or case study.

For those interested in submitting a tool, please get in contact with the One World Trust team at apro@oneworldtrust.org. For more information on the project, please visit our website. All relevant tools will be included in the database, which we propose to make publicly available online by August 2010. If we think a tool is not relevant, we promise to get back to you and explain why!

Thanks very much for your time,

Brendan.


Brendan Whitty
Principal Researcher
One World Trust
3 Whitehall Court, London, SW1A 2EL, UK
Now Available! The APRO toolkit provides guidelines outlining accountability principles for research organisations.
http://www.oneworldtrust.org
Tel +44 (0)20 7766 3463
Charity No. 210180

Conference: “Evaluation Revisited: Improving the Quality of Evaluative Practice by Embracing Complexity.”

PS: Videos of the event are now available: video_1.wmv, video_2.wmv, video_3.wmv. The Evaluation Revisited conference website also has a number of interesting post-conference comments by different participants.

Date: 20-21 May 2010
Venue: Netherlands

On May 20-21, 2010, a conference on evaluation for development will be held in The Netherlands: “Evaluation Revisited: Improving the Quality of Evaluative Practice by Embracing Complexity.”

This conference focuses on how evaluative practice can be improved, given the need to view much of development as a process of social transformation and, therefore, as complex. Current evaluation practice has not yet embraced the full implications of assessing ‘the complex’, and existing approaches often fall woefully short. During the conference, participants can explore concrete evaluation practices that reconcile an understanding of complex societal change processes with quality standards, including rigour, ethical concerns, appropriateness and feasibility.

European Evaluation Society Conference – Evaluation in the Public Interest – Participation, Politics and Policy

Date: 6-8 October 2010
Venue: Prague, Czech Republic

Dear colleagues,

On behalf of the EES Board of Directors, we are pleased to announce that the next EES biennial conference will take place in Prague, Czech Republic on 6-8 October 2010, preceded by professional development training sessions on October 4-5. We have launched a new website, www.ees2010prague.org, where you can find all relevant information regarding the conference. The website will be updated regularly; for the latest information please visit the Home – News section.

Please find attached a Call for Abstracts brochure with all information regarding abstract submission and the programme strands.
We want to encourage a wide range of contributions to the conference. We therefore offer different types of presentation format, which should be indicated when submitting your abstract. All abstracts are welcome, and the best ones are likely to be published or otherwise disseminated by EES. In addition, participants can make use of a wide spectrum of possibilities (panels and symposia, round tables, posters, etc.). We also invite creative and original vehicles using the performing arts, film, music, etc.

The conference topic “Evaluation in the Public Interest – Participation, Politics and Policy” will be discussed in five strands, to which you are welcome to submit proposals. The submission strands are:

1.    Ethics, capabilities and transparency
2.    Evaluation and politics
3.    Evaluation producers, beneficiaries, users and decision makers
4.    Sector policy evaluation
5.    Evaluation in developing and transition economies

Please note the overarching theme “methodology, standards, impact and effects”.
As we would like to welcome as many presenters and attendees at this conference as possible, we need your assistance! Please pass on this email to your colleagues and associates and encourage them to arrange a contribution.

The Conference administration is undertaken by:

CZECH-IN s.r.o.
Professional Event & Congress Organiser
Prague Congress Centre
5. kvetna 65
140 21 Prague 4
Czech Republic

In case of any questions, do not hesitate to contact us at:

Tel: +420 261 174 304
Fax: +420 261 174 307
Email: info@ees2010prague.org, abstracts@ees2010prague.org

Abstract submission opens on 18 February 2010

Registration opens on 18 February 2010

For more information please visit www.ees2010prague.org

Thank you very much
Looking forward to welcoming you in Prague in October 2010

Conference Secretariat and EES Conference co-chairs Marie Kaufmann & Claudine Voyadzis

American Evaluation Association – Call for Proposals

The American Evaluation Association invites evaluators from around the world to attend its annual conference to be held Wednesday, November 10, through Saturday, November 13, 2010 in San Antonio, Texas. We’ll be convening at the lovely Grand Hyatt San Antonio, right in the heart of the vibrant city and adjacent to the Riverwalk’s nightlife, restaurants, and strolling grounds. Discounted hotel reservations will be available in March.

AEA’s annual meeting is expected to bring together approximately 2500 evaluation practitioners, academics, and students, and represents a unique opportunity to gather with professional colleagues in a collaborative, thought-provoking, and fun atmosphere.

The conference is broken down into 44 Topical Strands that examine the field from the vantage point of a particular methodology, context, or issue of interest to the field as well as the Presidential Strand highlighting this year’s Presidential Theme of Evaluation Quality. Presentations may explore the conference theme or any aspect of the full breadth and depth of evaluation theory and practice.

Proposals are due by midnight Eastern time on Friday, March 19, 2010. More details here.

Challenges of Program Evaluation Practice in Africa

Call for Abstracts: Challenges of Program Evaluation Practice in Africa
The purpose of this announcement is to propose the production of a collective book on the challenges of program evaluation in Africa, regardless of the type of sector addressed (health, education, agriculture, etc.). This book is intended to be a critical and reflective analysis of how evaluations are conducted in Africa and what the challenges are of such practice. Through this process, the editors (Valéry Riddle, Seni Kouanda and Jean-François Kobiané) wish to enable francophone publications to take their place in program evaluation literature. Reporting on evaluation practices in Africa and improving methodology by sharing experience — that are both tasks that could be accomplished by chapters of this collective book — is essential.
Authors interested in this subject and wishing to participate in this project are invited to submit a 250-word abstract by February 28, 2010 to the following addresses: valery.ridde@umontreal.ca; skouanda@irss.bf; jfkobiane@issp.bf. This step is the first of six stages in a production process ending in December 2010.

INTRAC courses on M&E in 2010

1. Introduction to Monitoring and Evaluation

19-23 April 2010 (repeated 27 September–1 October 2010)

£1250 fully residential (4 nights accommodation and all meals), £999 non-residential (including lunch, refreshments and materials)

Oxford, UK

http://www.intrac.org/events.php?action=event&id=90

Monitoring and Evaluation (M&E) is essential for international NGOs, national NGOs and civil society organisations striving for greater accountability in their work. There is an increasing demand in the sector for staff who understand what M&E entails, why it is so vital, and how to do it well and in a participatory way. This course will give a thorough introduction to the concepts, practical knowledge and skills needed by new staff, or staff new to M&E. Participants will learn to conduct monitoring and evaluation activities that will help their projects and programmes improve accountability, learning and effectiveness.

Objectives of the course.

At the end of the course, participants will be able to:

  • Define the main terms and concepts associated with monitoring and evaluating projects and programmes
  • Articulate the key purposes of M&E and be able to prioritise according to the context
  • Select and use a range of tools with confidence
  • Apply results of M&E processes to both accountability and organisational learning.

“My trip to Burundi was fantastic and I really felt I went into it with open eyes after the training…I just wanted you to know how much your training has really helped me launch into monitoring and evaluation. It was just so exciting to finally apply this in the field.”

Danielle Tirello Givens, Program Associate, Africa and the Middle East, Episcopal Relief & Development – a participant on an Introduction to Monitoring and Evaluation course

2. Advanced Monitoring and Evaluation

26-30 April 2010 (repeated on 18-22 October 2010 and 14-18 March 2011)

£1250 fully residential (4 nights accommodation and all meals), £999 non-residential (including lunch, refreshments and materials)

Oxford, UK

http://www.intrac.org/events.php?action=event&id=91

This course explores M&E in greater depth than the introductory course. It is particularly relevant for staff who are managing or coordinating M&E in projects/programmes, trying to improve and enhance current M&E systems, and/or supporting partners to develop and implement effective M&E. The focus is on building coherent, effective and realistic systems that will serve to improve organisational learning and accountability.

Objectives of the course.

At the end of the course, participants will:

  • Have consolidated their understanding of terms and concepts in M&E
  • Be able to identify characteristics of an effective M&E system
  • Be able to design and implement an effective and contextually appropriate M&E system
  • Be better equipped to address challenges faced by their organisation and their partners in developing effective M&E systems.

3. Impact Assessment

26-28 May 2010 (repeated on 6-8 October 2010 and 23-25 March 2011)

With increased pressure on delivery and accountability, the need has never been greater for development and civil society organisations to assess the longer-term impact of their work. This three-day course explores the challenge of measuring impact and attribution and provides very practical tools and methods.

Objectives of the course

At the end of the course, participants will:

  • Have developed conceptual clarity of impact assessment and its purpose
  • Be able to select and apply appropriate methods and tools of assessing impact from a range of approaches
  • Have considered the relevance of their organisation’s theory of change in relation to impact assessment
  • Be better equipped to ensure that impact assessments contribute effectively to organisational learning and accountability.

“Excellent! As a facilitator myself, I’m really impressed.”

Anna Rambe, Programme Development Officer, Forum Syd – a participant on INTRAC’s Impact Assessment course
