Research Integration Using Dialogue Methods

David McDonald, Gabriele Bammer and Peter Deane, 2009. Download pdf

Ed: Although about “research integration”, the book is also very relevant to the planning and evaluation of development projects

“Research on real-world problems—like restoration of wetlands, the needs of the elderly, effective disaster response and the future of the airline industry—requires expert knowledge from a range of disciplines, as well as from stakeholders affected by the problem and those in a position to do something about it. This book charts new territory in taking a systematic approach to research integration using dialogue methods to bring together multiple perspectives. It links specific dialogue methods to particular research integration tasks.

Fourteen dialogue methods for research integration are classified into two groups:

1. Dialogue methods for understanding a problem broadly: integrating judgements

2. Dialogue methods for understanding particular aspects of a problem: integrating visions, world views, interests and values.

The methods are illustrated by case studies from four research areas: the environment, public health, security and technological innovation.”

Analyzing the Effects of Policy Reforms on the Poor: An Evaluation of the Effectiveness of World Bank Support to Poverty and Social Impact Analyses

World Bank, 2010

“The World Bank introduced the Poverty and Social Impact Analysis (PSIA) approach in fiscal 2002 to help governments and the Bank anticipate and address the possible consequences of proposed policy reforms, especially on the poor and vulnerable, and to contribute to country capacity for policy analysis. By fiscal 2007 the Bank had undertaken 156 pieces of analytical work using one or more elements of the PSIA approach (hereafter called PSIAs) in 75 countries and 14 sectors. Total donor support to PSIAs over fiscal 2004–06 was $15 million, which came from the Bank’s earmarked Incremental Fund for PSIAs ($5.8 million), earmarked PSIA Trust Funds contributed by various bilateral donors, and non-earmarked Bank budget and other donor funding.”…

“Although the Bank has submitted progress reports to donors regarding the implementation of PSIAs, it has not yet completed a comprehensive self-evaluation of the PSIA experience. This evaluation by the Independent Evaluation Group, requested by the Bank’s Board of Executive Directors, represents the first independent evaluation of the PSIA experience.”

Full text available online

Quantification of qualitative data in the water sector: The challenges

by Christine Sijbesma and Leonie Postma

Published in Water International, Volume 33, Issue 2, June 2008, pp. 150–161 (full text available here)

Abstract

Participatory methods are increasingly used in water-related development and management. Most information gathered with such methods is qualitative, and the general view is that such information cannot be aggregated and is therefore less interesting for managers. This paper shows that the opposite can be the case. It describes a participatory methodology that quantifies qualitative information for management at all levels. The methodology was developed to assess the sustainability of community-managed improved water supplies, sanitation and hygiene. It allows outcomes to be correlated with processes of participation, gender and social equity, and so to assess where changes are needed. The paper also describes how elements of positivistic research, such as sampling, were included. Application in over 15 countries showed that such quantified qualitative methods are an important supplement to, or alternative for, social surveys. While the new approach makes statistical analysis possible, it also increases the risk that participatory methods are used extractively when the demand for data on outcomes dominates over the quality of the process. As a result, the collection of qualitative information and the use of the data for community action and adaptive management get lost. However, when properly applied, quantification offers interesting new opportunities. It makes participatory methods more attractive to large programmes and gives practitioners and donors a new chance to adopt standards of rigor and ethics, and so to combine quantification with quality protection and developmental use.
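
Ed: As a rough illustration of the general idea (not the authors’ actual scoring methodology), qualitative ratings gathered with participatory tools can be coded onto an agreed ordinal scale and then aggregated and correlated like any other numeric data. All names, indicators and scores in the sketch below are hypothetical.

```python
# Illustrative only: coding qualitative participatory ratings onto an agreed
# ordinal scale so they can be aggregated and correlated. All data are
# hypothetical; this is not the paper's actual scoring system.

SCALE = {"poor": 0, "fair": 1, "good": 2, "very good": 3}

# Hypothetical per-community ratings from participatory assessment sessions.
communities = [
    {"name": "A", "water_supply": "good",      "participation": "very good"},
    {"name": "B", "water_supply": "fair",      "participation": "poor"},
    {"name": "C", "water_supply": "very good", "participation": "good"},
    {"name": "D", "water_supply": "poor",      "participation": "fair"},
]

water = [SCALE[c["water_supply"]] for c in communities]
part = [SCALE[c["participation"]] for c in communities]

# Aggregation for managers: mean scores across the sampled communities.
print("mean water-supply score:", sum(water) / len(water))
print("mean participation score:", sum(part) / len(part))

def pearson(xs, ys):
    """Pearson correlation; a rank correlation would suit ordinal data better."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Correlating an outcome (water supply) with a process (participation).
print("water vs participation:", round(pearson(water, part), 2))
```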

PARTICIPATORY MONITORING AND EVALUATION WORKSHOP

Date: July 27-August 1, 2009
Venue: University of Ottawa, Canada

This Six-Day PM&E Workshop will show you how to:

* Rethink your own monitoring and evaluation strategies and approaches;
* Master new and innovative participatory PM&E tools for the workplace;
* Facilitate PM&E processes for your project, programme or organization;
* Develop monitoring and evaluation plans in a more participatory manner;
* Integrate gender, ethnicity, class and sexuality issues and concerns into your PM&E work;
* Integrate qualitative and participatory methods into monitoring and evaluation.

International Course on ‘Participatory Planning, Monitoring & Evaluation: Managing and Learning for Impact’

Date: 02-20 March 2009
Venue: Wageningen, The Netherlands

This course is organised by Wageningen International, part of Wageningen University and Research Centre. The course focuses on how to design and institutionalise participatory planning and M&E systems in projects, programmes and organisations for continuous learning and enhanced performance. Particular attention is paid to navigating and managing for impact and to the relationship between management information needs and responsibilities and the planning and M&E functions. For more info please visit our website: http://www.cdic.wur.nl/UK/newsagenda/agenda Participatory_planning_monitoring_and_evaluation.htm

or contact us: training.wi@wur.nl or cecile.kusters@wur.nl

Participants come from all over the world, from the government, NGO and academic sectors, and are mainly in management positions or M&E functions. You are most welcome to join this select group!

Kind regards,

Cecile Kusters
Participatory Planning, Monitoring & Evaluation
Multi-Stakeholder Processes and Social Learning
Wageningen UR, Wageningen International
P.O.Box 88, 6700 AB Wageningen, the Netherlands
Visiting address: Lawickse Allee 11,Building 425, 6701 AN Wageningen, The Netherlands
Tel. +31 (0)317-481407
Fax. +31 (0)317-486801
e-mail: cecile.kusters@wur.nl
Website: www.cdic.wur.nl/UK
PPME resource portal: http://portals.wi.wur.nl/ppme
MSP resource portal: http://portals.wi.wur.nl/msp/

Participatory Impact Assessment: a Guide for Practitioners

The Feinstein International Center has been developing and adapting participatory approaches to measure the impact of livelihoods-based interventions since the early 1990s. Drawing on this experience, the guide aims to provide practitioners with a broad framework for carrying out project-level Participatory Impact Assessments (PIAs) of livelihoods interventions in the humanitarian sector. Apart from some health, nutrition and water interventions, in which indicators of project performance should relate to international standards, for many interventions there are no ‘gold standards’ for measuring project impact. This guide aims to bridge that gap by outlining a tried and tested approach to measuring the impact of livelihoods projects. The tools in the guide have been field-tested over the past two years in a major research effort, funded by the Bill & Melinda Gates Foundation and involving five major humanitarian NGOs working across Africa.

Download a PDF copy of the guide here

GSDRC Helpdesk Research Report: Monitoring and Evaluation of Participation in Governance

M&E of Participation in Governance: Please identify toolkits, methodologies and indicators for monitoring and evaluating the effectiveness of programmes aimed at improving governance (particularly of urban infrastructure/services). Please highlight methods of relevance to NGOs for monitoring and evaluating poor people’s participation in decision-making processes.

Helpdesk response
Key findings: There is generally very little information available on evaluating the effectiveness of the inclusive/participatory aspects of governance programmes. A particular difficulty is that there is a limited understanding of what improvements in governance actually look like. Nevertheless, some common principles identified in the literature include the need for both quantitative and qualitative indicators and the importance of focusing on purpose, processes, context and perception as well as outputs and outcomes.

Some common indicators for assessing the effectiveness of participatory programmes include:

  • the level of participation of different types of stakeholders
  • institutional arrangements to facilitate engagement
  • active engagement of stakeholders in the programme, and confidence and willingness to get involved in future
  • the extent to which participants are mobilising their own resources
  • transparent access to and use of resources
  • equality of access to decision-making
  • transformation of power through e.g. new relationships and access to new networks
  • level of trust in and ownership of the process
  • behavioural changes of stakeholders (values, priorities, aims)
  • level of self-reliance, self-management, capacity and understanding of the issues
  • sustainability and ability to resolve conflict.

Full response: http://www.gsdrc.org/docs/open/HD549.pdf

Produced by the Governance and Social Development Resource Centre

Annual Praxis Commune on Participatory Development

Date: 19th – 28th August, 2008
Venue: KILA Campus, Thrissur (Kerala), India

This residential workshop acts as a forum for participants from across the world to come together for reflection and learning. It provides both a theoretical understanding of participatory approaches and tools and the opportunity to apply them in the field. The ten days include general and specific module-based theory, three days in rural, peri-urban and urban field settings as appropriate to the module content, and finally a sharing, reflection and feedback session.

3rd annual Measuring Effectiveness conference: ‘Participation, Empowerment and Downward Accountability’

Greetings,

The 3rd annual Measuring Effectiveness conference will be held in Melbourne, on Thursday 25th & Friday 26th September, 2008.

We sincerely hope that you will again be inspired to attend this important event. This year sees World Vision Australia and The Australian National University partnering to bring you a conference themed around ‘Participation, Empowerment and Downward Accountability’.

Attached is the call for papers, with submissions requested by Friday 20th June, 2008. That gives you eight weeks to submit your paper. Competition is increasing each year, so please meet this deadline to ensure your paper is given full consideration. For all further details please refer to the attached.

Conference updates will be posted regularly on the World Vision website, and registrations will again be managed online. We will endeavour to have the conference brochure available online in late May 2008, and the final draft conference program, outlining the speakers/presenters and session outlines, available online by late August 2008. Further email correspondence will also be sent out in the coming months; however, the best source of information will be the website, so please check it regularly. There you will also find information and papers from previous ME conferences, as well as other development conferences.

http://www.worldvision.com.au/learn/conferences/index.asp

Please distribute this email amongst your colleagues and networks who may also be interested.

Regards,

Melissa Cadwell | Program Coordinator
Program Effectiveness | World Vision Australia
phone / fax: +61 3 9287 2769
Email: measuringeffectiveness@worldvision.com.au

Website : http://www.worldvision.com.au/learn/conferences/index.asp

Predicting the achievements of the Katine project

September 2010: This post provides information on a revised proposal for a “Predictions Survey” on the achievements of the Katine Community Partnerships Project, a project managed by AMREF and funded by the Guardian and Barclays Bank, between 2007 and 2011.

Background Assumptions

The Guardian coverage of the Katine project has provided an unparalleled level of public transparency to the workings of an aid project. As of August 2010 there have been approximately 530 articles posted on the site, most of which have been specifically about Katine. These posts have included copies of project documentation (plans, budgets, progress reports, review reports) that often do not enter the public realm.

Ideally this level of transparency would have two benefits: (a) improving UK public knowledge about the challenges of providing effective aid, and (b) imposing some constructive discipline on the work of the NGO concerned, because they know they are under continuing scrutiny, not only locally but internationally. Whether this has actually been the case is yet to be systematically assessed. However, I understand the effects on the project and its local stakeholders (i.e. (b) above) will be subject to review by Ben Jones later this year, and then open to discussion in a one-day event in November, to be organised by the Guardian.

So far there have been two kinds of opportunities for the British, and other publics, to be engaged with the public monitoring of the Katine project. One has been through posting comments on the articles on the Guardian website: about 30% of all articles have provided this opportunity, and these articles have attracted an average of 5 comments each. The other has been by invitation from the Guardian to make a guest posting on the website; this invitation has been extended to specialists in the UK and elsewhere. Multiple efforts have also been made to hear different voices from within the Katine community itself.

The Predictions Survey would provide another kind of opportunity for participation. It would allow a wide range of participants to:

  • make some judgments about the overall achievements of the project,
  • explain those judgments,
  • see how those judgments compare with those of others, and
  • see how those judgments compare with the facts about what has actually been achieved at the end of the project.

In addition, a Predictions Survey would provide a means of testing the expectation that greater transparency can improve public knowledge about the challenges of providing effective aid.

My proposal is that the Predictions Survey would consist of five batches of questions, one for each project component, each on a separate page. Each question would be multiple choice, with an associated optional comment field. People could respond on the basis of their existing knowledge of the project (which could vary widely) and/or extra information from the website, obtained via component-specific links embedded at the head of each page of the online survey (e.g. on water and sanitation). Questions at the end of the survey would identify participants’ sources of knowledge about the project (e.g. obtained before and during the survey, from the website and elsewhere).
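
A minimal sketch of how this proposed structure could be represented in an online survey script follows. Apart from “water and sanitation”, which is mentioned above, the component names, question wordings, options and URLs are placeholders, not the actual survey content.

```python
# Sketch of the proposed survey structure: five pages of multiple-choice
# questions (one per project component), each question with an optional
# comment field and a component-specific background link at the head of
# the page. Names, questions, options and URLs are placeholders.

from dataclasses import dataclass

@dataclass
class Question:
    text: str
    options: list[str]   # mutually exclusive descriptions of what might happen
    comment: str = ""    # optional free-text explanation of the judgment

# Placeholder components (only "water and sanitation" appears in the post).
components = ["water and sanitation", "education", "health",
              "livelihoods", "governance"]

survey = {
    c: {
        "info_link": f"https://example.org/katine/{c.replace(' ', '-')}",
        "questions": [Question(
            text=f"By October 2011, what will have been achieved in {c}?",
            options=["well above what was planned",
                     "roughly what was planned",
                     "well below what was planned"])],
    }
    for c in components
}

# Closing questions would then ask about respondents' sources of knowledge
# (before and during the survey, from the website and elsewhere).
for component, page in survey.items():
    print(component, "->", page["questions"][0].text)
```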

A first rough draft of the survey form is already available to view. Any responses entered at this stage may be noted, but they will then be deleted and not included in any final analysis. The final design of the survey will require close consultation with AMREF and the Guardian.

Intended participants in the survey

  • UK public, reached via the Guardian
  • Uganda public, reached via Ugandan newspapers (likely to be more of a challenge)
  • AMREF staff, especially in Uganda, at the Kenya HQ and in the UK
  • The Guardian and Barclays, as donors
  • Monitoring and Evaluation specialists, reached via an international email list

Hypotheses (predictions about the predictions)

  1. We might expect that AMREF would be able to make the most accurate predictions, given its central role. But aid agencies are often tempted to put a gloss on their achievements, because of the gap that sometimes emerges between their ambitions and what can actually be done in practice.
  2. We might expect that participants who have been following the Guardian coverage closely since the beginning might be better informed and make better predictions than others who have become interested more recently. But perhaps those participants are still responding on the basis of their original beliefs (aka biases)?
  3. We might expect M&E specialists to make better than average predictions because of their experience in analysing project performance. But perhaps they have become too skeptical about everything they read.
  4. We might expect the Guardian and Barclays staff to make better than average predictions because they have been following the project closely since inception and their organisation’s money is  invested in it. But perhaps they only want to see success.
  5. We might expect the highest-frequency choices (across all groups) to be more accurate than the choices of any of the above groups, because of a “wisdom of crowds” effect. The potential of crowdsourcing was of interest to the Guardian at the beginning of the project, and this survey could be seen as a form of crowdsourcing of judgements (a simple scoring sketch follows below).
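
Once the actual outcomes are known, hypotheses 1 to 5 could be tested by comparing each group’s answers with the options that eventuated. The following is a minimal scoring sketch; the groups, answers and “actual” outcomes are all hypothetical.

```python
# Sketch of testing the hypotheses once actual outcomes are known. The
# groups, answers and "actual" outcomes below are entirely hypothetical.
from collections import Counter

actual = {"q1": "B", "q2": "A", "q3": "C"}   # options that eventuated

responses = {    # respondent id -> (group label, chosen options)
    "r1": ("AMREF staff",     {"q1": "B", "q2": "A", "q3": "B"}),
    "r2": ("M&E specialists", {"q1": "B", "q2": "C", "q3": "C"}),
    "r3": ("UK public",       {"q1": "A", "q2": "A", "q3": "C"}),
    "r4": ("UK public",       {"q1": "B", "q2": "A", "q3": "A"}),
}

def accuracy(choices):
    """Share of questions where the chosen option matched what happened."""
    return sum(choices[q] == actual[q] for q in actual) / len(actual)

# Hypotheses 1-4: mean prediction accuracy per group.
by_group = {}
for group, choices in responses.values():
    by_group.setdefault(group, []).append(accuracy(choices))
for group, scores in sorted(by_group.items()):
    print(group, round(sum(scores) / len(scores), 2))

# Hypothesis 5 ("wisdom of crowds"): score the modal choice per question.
crowd = {q: Counter(c[q] for _, c in responses.values()).most_common(1)[0][0]
         for q in actual}
print("crowd (modal choices):", round(accuracy(crowd), 2))
```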

This list is not final. Other hypotheses could be identified in the process of consultation over the design of the survey.

There may also be other, less testable predictions worth identifying: for example, about the effects of this Predictions Survey on the work done by AMREF and its partners in the final year up to October 2011. Might it lead to a focus on what is being measured by the survey, to the detriment of other important aspects of their work? If AMREF has a comprehensive monitoring framework and the Predictions Survey addresses the same breadth of performance (and not just one or two performance indicators), this should not be a problem.

Timeframe

The fourth and final year of the project starts in October 2010 and ends in October 2011.

The finalisation of the design of the Predictions Survey will require extensive consultation with AMREF and the Guardian, in order to ensure the fullest possible ownership of the process, and thus of the results that are generated. Ideally this process would be completed by late October 2010.

The survey could be open from late October 2010 to the end of March 2011 (six months before the end of the project). All responses would be date-stamped to take account of any advantages of being a later participant.
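
As a sketch of how those date stamps could be used (with hypothetical dates and accuracy scores), responses could simply be bucketed into earlier and later groups and their mean prediction accuracy compared:

```python
# Sketch: checking whether later respondents made more accurate predictions.
# Dates and accuracy scores (0-1) are hypothetical.
from datetime import date

responses = [          # (date submitted, prediction accuracy)
    (date(2010, 11, 5), 0.4), (date(2010, 12, 12), 0.5),
    (date(2011, 2, 3), 0.7), (date(2011, 3, 20), 0.6),
]

cutoff = date(2011, 1, 1)
early = [a for d, a in responses if d < cutoff]
late = [a for d, a in responses if d >= cutoff]
print("mean accuracy, early respondents:", sum(early) / len(early))
print("mean accuracy, later respondents:", sum(late) / len(late))
```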

A process for obtaining objective information on which of the multiple-choice options have eventuated by October 2011 will need to be agreed in 2010.

A post-2011 follow-up survey may be worth considering. This would focus on predictions of what will happen in the post-project period, up to 2014, the year of the vision statement produced by participants in the September 2009 stakeholders’ workshop in Katine:

“In 2014, Katine will be an active, empowered community taking responsibility for their development with decent health, education, food security and able to sustain it with the local government”

Supporters

The participation of the Guardian and AMREF will be very important, although it is conceivable that the survey could be run independently of their cooperation.

Assistance with publicity, to find participants, would be needed from the Guardian and Barclays.

Advisory support is being sought from the One World Trust.

Advisory support from other organisations could also be useful.

The online survey could be designed and managed by Rick Davies. However, responsibility could be given to another party agreed to by AMREF, the Guardian and Barclays.

Challenges

  • The survey needs to be short enough to encourage people to complete it, but not so short that important aspects of the project’s performance are left out.
  • The descriptions of the objectives used in the survey need to be as clear and specific as possible, while keeping as close to AMREF’s original words as possible (i.e. as in the 4th-year extension proposal, and using the M&E framework, now being updated).
  • Participants will be asked to make a single choice between multiple options describing what might happen. These options will need to be carefully chosen, so that there are no obvious “no-brainers” and they cover a range of plausible possibilities.
  • It may be necessary in some cases (e.g. with some broadly defined objectives) to allow multiple choices from multiple options.
  • I have heard that AMREF will be conducting a final evaluation in late 2011, using an external consultant. This evaluation could be the source of the final set of data on actual performance, against which participants’ predictions could be compared. But will it be seen as a sufficiently independent source of information?