International Conference on National Evaluation Capacities


Date: 12-14 September 2011
Venue: Johannesburg, South Africa

The UNDP Evaluation Office and the Public Service Commission (PSC) of South Africa are co-hosting the second International Conference on National Evaluation Capacities. See the official website here.

This is a follow-up to the 2009 International Conference on National Evaluation Capacities held in Casablanca, Morocco, which was organized by the UNDP Evaluation Office in partnership with the Moroccan National Observatory for Human Development.

Objectives

1. To share experiences from countries with different levels of development of national M&E systems, including those that may be considering creating one but already have important experience with other types of evaluation efforts;

2. To identify lessons and constraints in implementing national M&E systems; and,

3. To identify supply and demand for technical assistance in strengthening institutional capacity for national M&E systems under the umbrella of South-South cooperation.

If you have any questions, please send your inquiry to: nec.2011@undp.org

Mr. Indran A. Naidoo, Deputy Director General, Monitoring and Evaluation
Office of the Public Service Commission, South Africa

Ms. Azusa Kubota, Evaluation Specialist, UNDP Evaluation Office

Follow on Twitter: @NEC_2011 for the latest information on the International Conference on National Evaluation Capacities – 12-14 September 2011.

A list of M&E training providers

Update 20 December 2014: The contents of this page have become woefully out of date, and it would be more than a full-time job to keep them current.

My advice is now as follows:

If you are looking for M&E training opportunities, visit the MandE NEWS Training Forum, which lists all upcoming training events. There are many training providers listed there, along with links to their websites.

Please also consider taking part in the online survey of training needs.

If you are a training provider, please look at the cumulative results to date of that survey.

I have now deleted the list of training providers that was previously shown below.

A results take-over of aid effectiveness? How to balance multiple or competing calls for more accountability

Date: 25 July 2011, 12:00-13:30 (BST, GMT+1)
Venue: British Academy, London

This debate will explore possible tensions – and opportunities – when donors seek to reassure domestic publics that aid is being spent well, while also endeavouring to support the needs and priorities of aid recipient countries and their citizens.

The language of results is not new – it is integral to the aid effectiveness agenda. But against the backdrop of growing financial constraints, it is receiving renewed emphasis in many donor countries. How can domestic accountability to both these constituencies be supported more effectively? Are there tensions between these different stakeholders and forms of accountability, and how can they be addressed?

Speakers:
Sarah Cliffe – Special Representative and Director, World Development Report 2011: Conflict, Security, and Development
Sue Unsworth – The Policy Practice, and ODI Board Member
Alan Hudson – Senior Policy Manager, Governance (Transparency & Accountability), ONE
John Morlu – former Auditor General, Liberia
Chair:  Alison Evans – Director, ODI

An ODI and BBC World Service Trust public event in the ‘Busan and beyond: aid effectiveness in a new era’ series.


ISO International Workshop Agreement (IWA) on Evaluation Capacity Development

Date: 17-21 October 2011
Venue: John Knox Centre, Geneva, Switzerland

Dear Colleagues:

A proposal prepared by the Evaluation Capacity Development Group (ECDG) and the Joint Committee on Standards for Educational Evaluation (JCSEE), in partnership with the International Organization for Cooperation in Evaluation (IOCE), to create an International Workshop Agreement (IWA) on evaluation capacity development (ECD) was recently approved by the International Organization for Standardization (ISO).

Everyone agrees that there is an acute need to develop evaluation capacity. However, resolution of the problem has not been possible because there is no agreement on HOW to develop evaluation capacity. Some think that individual evaluators should be better trained through workshops and seminars. Others think that organizations should be redesigned to enable the achievement of a shared vision for evaluation. And yet others think that evaluation should be institutionalized in national governments to promote accountability to their citizens.

We are now organizing a workshop that will be held 17-21 October 2011 at the John Knox Centre, Geneva, Switzerland.  The workshop will use a systems approach to develop an IWA that integrates ECD at the individual, organizational and national levels.  I am particularly pleased to inform you that a leading expert in systems-based evaluation, Bob Williams, has consented to facilitate the event.

As per the procedures explained in Annex SI of the Supplement to the ISO/IEC Directives, ANY organization with an interest in evaluation capacity development can register to send a representative to the workshop to participate in the preparation of this important document. Limited support may be available.  To learn more about the workshop and to register please go to http://www.ecdg.net/

Best Regards,

Karen Russon
President
Evaluation Capacity Development Group

10th European Evaluation Society Biennial Conference, Helsinki, Oct 2012

Date: 3-5 October 2012
Venue: Helsinki, Finland

EVALUATION IN THE NETWORKED SOCIETY: NEW CONCEPTS, NEW CHALLENGES, NEW SOLUTIONS

The Tenth Biennial Conference of the European Evaluation Society will be the international evaluation event of the year. It will be held in Helsinki, Finland during 3-5 October 2012 (pre-conference workshops 1-2 October). Mark your calendars!!

Evaluators are living in times of unprecedented challenge and opportunity. The networked information environment is inducing fundamental changes in culture, politics and society. Whereas the industrial society was reliant on centralised, hierarchical, high cost information systems, the networked society is characterised by decentralised, voluntary and cheap information exchange.

The advent of social networking without borders will have fundamental implications for evaluation agendas and methods. First, it will redefine the value and legitimacy of evaluation in global social accountability networks and accelerate the internationalisation of evaluation. Second, evaluation cultures, structures and processes will have to deal with the limitless quantity, speed and accessibility of information generated by new technologies, e.g. drawing useful meaning from huge databases and assessing the validity of an exploding number of rating systems, league tables, etc., in ways consistent with democratic values of freedom of expression and protection of privacy.

The new information technologies offer new ways of making authority responsible and accountable as well as bringing real time citizen involvement and reliable information to bear on public policy making. What are the implications of an information economy that allows instant connectivity to thousands of program beneficiaries suddenly able to make their voices heard? Will the spread of mobile telephony to the weakest and most vulnerable members of society and the rising power of social networks act as evaluative and recuperative mechanisms or will they merely aggravate social instability? What are the risks of network capture by single or special interest groups and cooptation of evaluation?

The rise of the evaluation discipline is inextricably linked to the values central to any democratic society. How will these values be protected in a context where weak links and increasing inequalities have created new fissures in society? How will evaluation independence be protected against the pressures of vested interests intent on retaining control over the commanding heights of the society?

To help explore these and other issues relevant to the prospects of evaluation in Europe and beyond, the Conference will stimulate evaluators to share ideas, insights and opinions about a wide range of topics that will throw light on the future roles of evaluation in the networked society. The Conference will help draw evaluation lessons learnt in distinct sectors and regions of the world. It will also examine the potential of alternative and mixed evaluation methods in diverse contexts and probe the challenges of assessing public interest in complex adaptive systems and networks.

To these ends the Conference will offer participants a wide choice of vehicles for the transmission of evaluation experience and knowledge: keynote speeches, paper presentations, panel debates, posters, etc.  As in past years the EES Conference will aim at a pluralistic agenda that respects the legitimacy of different standpoints, illuminates diverse perspectives and promotes principled debate. The Conference will also provide an opportunity for evaluation networks to interact and improve the coherence of their activities.

We look forward to welcoming you in Helsinki. It is one of the world’s leaders in modern design and it provides Europe with a world-class high-tech platform. It also boasts a 450-year history and lays claim to being the warmest, friendliest, most “laid back” city of Northern Europe. Its nearby archipelago of islands offers an ideal environment for sea cruises and its neighboring old-growth forests provide an idyllic setting for restful nature walks. We promise you an enjoyable as well as a professionally rewarding time!!

Ian Davies, President, European Evaluation Society
Maria Bustelo, Vice President and President Elect, European Evaluation Society

www.europeanevaluation.org 

Randomised controlled trials, mixed methods and policy influence in international development – Symposium

Thinking out of the black box. A 3ie-LIDC Symposium
Date: 17:30 to 19:30 Monday, May 23rd 2011
Venue: John Snow Lecture Theatre, London School of Hygiene and Tropical Medicine (LSHTM) Keppel Street, London, WC1E 7HT

Professor Nancy Cartwright, Professor of Philosophy, London School of Economics
Professor Howard White, Executive Director, 3ie
Chair: Professor Jeff Waage, Director, LIDC

Randomised Controlled Trials (RCTs) have moved to the forefront of the development agenda as a means to assess development results and the impact of development programs. In the words of Esther Duflo – one of the strongest advocates of RCTs – RCTs allow us to know which development efforts help and which cause harm.

But RCTs are not without their critics, with questions raised about their usefulness, both in providing more substantive lessons about the program being evaluated and in whether the findings can be generalized to other settings.

This symposium brings perspectives from the philosophy of science, and a mixed method approach to impact analysis, to this debate.

ALL WELCOME
For more information contact: 3ieuk@3ieimpact.org

PS1: Nancy Cartwright wrote “Are RCTs the Gold Standard?” in 2007

PS2: The presentation by Howard White is now available here – http://tinyurl.com/3dwlqwn – but without audio.

AusAID’s Information Publication Scheme: Draft Plan & Consultation

The 12 April 2011 draft plan is now available in PDF and MS Word formats.

Introduction

“AusAID is the Australian Government’s Agency for International Development, an executive agency within the Department of Foreign Affairs and Trade portfolio. Its primary role is the implementation and oversight of the Australian Government aid program. The aim of the program is to assist developing countries reduce poverty and achieve sustainable development, in line with Australia’s national interest.

Reforms to the Freedom of Information Act 1982 (FOI Act) have established the Information Publication Scheme (IPS). The purpose of the IPS is to give the Australian community access to information held by the Australian Government and to enhance and promote Australia’s representative democracy by increasing public participation in government processes and increasing scrutiny, discussion, comment and review of government activities and decisions.

AusAID is committed to greater transparency through the implementation of the Information Publication Scheme (IPS) and other initiatives that will be introduced. As Australia’s ODA commitment has increased, public interest in the aid program has correspondingly increased and this will continue. Implementation of the IPS will provide more information to Australians about AusAID’s activities and help increase public participation, understanding and scrutiny of Australia’s aid program.

This draft plan has been prepared to assist AusAID implement the IPS, in accordance with section 8(1) of the Freedom of Information Act (FOI Act) 1982, and to give the Australian public the opportunity to comment and provide feedback on this plan.

As AusAID’s final plan is implemented it will be progressively updated in light of experience and feedback. The list of documents that is a core part of this plan will, in particular, be amended.”

The consultation: Visit this AusAID website to see how to participate and to read the views of others who have already contributed.


SLEvA 2011 International Conference in Colombo

Date: 8-9 June 2011
Venue: Colombo, Sri Lanka
The Sri Lanka Evaluation Association (SLEvA) will hold the SLEvA 2011 International Conference in Colombo, Sri Lanka, on 8-9 June 2011. The conference will be preceded by pre-conference professional development workshops conducted by leading professionals in the field, as well as by professionals working to build the evaluation field in South Asia, such as the Community of Evaluators, members of the Consortium of Academic Institutions for Teaching Evaluation in South Asia (TESA) and the International Organisation for Cooperation in Evaluation.

The conference is expected to bring together around 150 professionals in evaluation, academics and members of regional and global evaluation associations. It will provide an opportunity to share knowledge and ideas with professionals and practitioners in evaluation and to learn of initiatives in South Asia.

The overall theme of the conference will be ‘Evaluation for Policy and Action’, with the following subthemes:

  • Evaluation for influencing policy and policy evaluation
  • Evaluation for supporting development programmes
  • Evaluation in disaster reduction and management
  • Evaluating networks and partnerships
  • Building the evaluation field
  • Evaluation methodologies and approaches
  • Other evaluation issues

The conference website is at www.sleva.lk

SLEvA invites paper abstracts, proposals for panel discussions, exhibits and displays, and networking events, and welcomes your ideas for sharing information and learning.

INTRAC, PSO & PRIA Monitoring and Evaluation Conference

Monitoring and evaluation: new developments and challenges
Date: 14-16 June 2011
Venue: The Netherlands

This international conference will examine key elements and challenges confronting the evaluation of international development, including its funding, practice and future.

The main themes of the conference will include: governance and accountability; impact; M&E in complex contexts of social change; the M&E of advocacy; the M&E of capacity building; programme evaluation in an era of results-based management; the M&E of humanitarian programmes; the design of M&E systems; evaluating networks, including community-driven networks; and changing theories of change and how this relates to M&E methods and approaches.

Call for M&E Case Studies

Case study abstracts (max. 500 words) are invited that relate to the conference themes above, with an emphasis on what has been done in practice. There will be a competition for the best three cases, and their authors will be invited to the UK early to work on their presentations for a plenary session. We will also identify a range of contributions for publication in Development in Practice.
Download the full case study guidelines, and submit your abstracts via email to Zoe Wilkinson.

Case studies abstracts deadline: 11 March 2011

‘Realist evaluation – understanding how programs work in their context’; An expert seminar with Dr. Gill Westhorp; Wageningen, the Netherlands; 29-03-2011

Date: 29 March 2011
Venue: Wageningen, the Netherlands

Dear colleague,

We are pleased to announce an expert seminar with Dr. Gill Westhorp on 29 March 2011: ‘Realist evaluation – understanding how programs work in their context’.

Realist evaluation (Pawson and Tilley, 1997) is one type of theory-based evaluation. It aims to explore “what works, for whom, in what contexts, to what extent and how”. It adopts a particular understanding of how programs work, and uses a particular format for program theories to help guide evaluation design, data collection and analysis.

Realist evaluation has a particular focus on understanding the interactions between programs and their contexts and the ways that these influence how programs work. Evaluation expert Dr. Gill Westhorp will discuss the concepts and assumptions that underpin this theory-based evaluation approach. What is it that realist evaluation brings to the table of evaluating development programs? How is it different from existing approaches to evaluation in development? How does it understand, and deal with, complexity? What new insights can help strengthen the utility of evaluation for development?

During the morning, Gill will introduce the basic assumptions and key concepts in realist evaluation.  She will also briefly demonstrate how these ideas can be built into other evaluation models using two examples.  These models – realist action research and realist program logic – are participatory models which were designed for use in settings where limited resources, lack of capacity to collect outcomes data, complex programs, and (sometimes) small participant numbers make evaluation difficult.  In the afternoon, the practical implications for evaluation design, data collection and analysis will be discussed. Examples and practical exercises will be included throughout the day.

For those interested and not too far away at that time, please do come and join this interesting event!

Please find attached the factsheet flyer and the registration form. We also suggest that you make an early hotel booking (http://www.hofvanwageningen.nl/?language=en), as the hotel is already quite full. Please indicate to the hotel that you are booking a room for the ‘expert seminar realist evaluation’.

Note: the expert seminar with Dr. Michael Quinn Patton on ‘developmental evaluation’ unfortunately had to be cancelled due to personal reasons. We hope to organise another opportunity with him early next year.

Looking forward to meeting you here at the expert seminar on realist evaluation!

Cecile Kusters (CDI), Irene Guijt (Learning by Design), Jan Brouwers (Context, international cooperation) and Paula Bilinsky (CDI)

Kind regards / Hartelijke groeten,

Cecile Kusters
Participatory Planning, Monitoring & Evaluation – Managing for Impact
Multi-Stakeholder Processes and Social Learning
Centre for Development Innovation
Wageningen UR
P.O. Box 88, 6700 AB Wageningen, The Netherlands
Tel. +31 (0)317 481407 (direct), +31 (0)317 486800 (reception)
Fax +31 (0)317 486801
e-mail cecile.kusters@wur.nl
www.cdi.wur.nl
PPME resource portal: http://portals.wi.wur.nl/ppme/
MSP resource portal: http://portals.wi.wur.nl/msp/
