Improving the Evaluability of INGO Empowerment and Accountability Programmes

Shutt, C. and McGee, R. (2013) CDI Practice Paper 1, March 2013. Publisher: IDS. Available as pdf (109kb).

This CDI Practice Paper is based on an analysis of international NGO (INGO) evaluation practice in empowerment and accountability (E&A) programmes commissioned by CARE UK, Christian Aid, Plan UK and World Vision UK. It reviews evaluation debates and their implications for INGOs. The authors argue that if INGOs are to successfully ‘measure’ or assess outcomes and impacts of E&A programmes, they need to shift attention from methods to developing more holistic and complexity-informed evaluation strategies during programme design. Final evaluations or impact assessments are no longer discrete activities, but part of longer-term learning processes. Given the weak evaluation capacity within the international development sector, this CDI Practice Paper concludes that institutional donors must have realistic expectations and support INGOs to develop their evaluation capacity in keeping with cost–benefit considerations. Donors might also need to reconsider the merits of trying to evaluate the ‘impact’ of ‘demand-side’ NGO governance programmes independently of potentially complementary ‘supply-side’ governance initiatives.

See also: Tools and Guidelines for Improving the Evaluability of INGO Empowerment and Accountability Programmes, Centre for Development Impact, Practice Paper No.1 Annex, March 2013.

RCTs for empowerment and accountability programmes

A GSDRC Helpdesk Research Report, Date: 01.04.2011, 14 pages, available as pdf.

Query: To what extent have randomised control trials been used to successfully measure the results of empowerment and accountability processes or programmes?
Enquirer: DFID
Helpdesk response
Key findings: This report examines the extent to which RCTs have been used successfully to measure empowerment and accountability processes and programmes. Field experiments present immense opportunities, but the report cautions that they are more suited to measuring short-term results with short causal chains and less suitable for complex interventions. The studies have also demonstrated divergent results, possibly due to different programme designs. The literature highlights that issues of scale, context, complexity, timeframe, coordination and bias in the selection of programmes also determine the degree of success reported. It argues that researchers using RCTs should make more effort to understand contextual issues, consider how experiments can be scaled up to measure higher-order processes, and focus more on learning. The report suggests strategies such as using qualitative methods, replicating studies in different contexts and using randomised methods with field activities to overcome the limitations in the literature.
1. Overview
2. General Literature (annotated bibliography)
3. Accountability Studies (annotated bibliography)
4. Empowerment Studies (annotated bibliography)


Measuring Empowerment? Ask Them

Quantifying qualitative outcomes from people’s own analysis: insights for results-based management from the experience of a social movement in Bangladesh. Dee Jupp and Sohel Ibn Ali, with a contribution from Carlos Barahona. 2010: Sida Studies in Evaluation. Download pdf.


Participation has been widely taken up as an essential element of development, but participation for what purpose? Many feel that its acceptance, which has extended to even the most conventional of institutions such as the international development banks, has resulted in it losing its teeth in terms of the original ideology of being able to empower those living in poverty and to challenge power relations.

The more recent emergence of the rights-based approach discourse has the potential to restore the ‘bite’ to participation and to re-politicise development. Enshrined in universal declarations and conventions, it offers a palatable route to accommodating radicalism and creating conditions for emancipatory and transformational change, particularly for people living in poverty. But an internet search on how to measure the impact of these approaches yields a disappointing harvest of experience. There is a proliferation of debate on the origins and processes, the motivations and pitfalls of rights-based programming but little on how to know when or if it works. The discourse is messy and confusing and leads many to hold up their hands in despair and declare that outcomes are intangible, contextual, individual, behavioural, relational and fundamentally un-quantifiable!

As a consequence, results-based management pundits are resorting to substantive measurement of products, services and goods which demonstrate outputs and rely on perception studies to measure outcomes.

However, there is another way. Quantitative analyses of qualitative assessments of outcomes and impacts can be undertaken with relative ease and at low cost. It is possible to measure what many regard as unmeasurable.

This publication suggests that steps in the process of attainment of rights and the process of empowerment are easy to identify and measure for those active in the struggle to achieve them. It is our etic perspectives that make the whole thing difficult. When we apply normative frames of reference, we inevitably impose our values and our notions of democracy and citizen engagement rather than embracing people’s own context-based experience of empowerment.

This paper presents the experience of one social movement in Bangladesh, which managed to find a way to measure empowerment by letting the members themselves explain what benefits they acquired from the Movement and by developing a means to measure change over time. These measures, which are primarily of use to the members, have then been subjected to numerical analysis outside of the village environment to provide convincing quantitative data that satisfies the demands of results-based management.

The paper is aimed primarily at those who are excited by the possibilities of rights-based approaches but who are concerned about proving that their investment results in measurable and attributable change. The experience described here should build confidence that transparency, rigour and reliability can be assured in community-led approaches to monitoring and evaluation without distorting their original purpose: a system of reflection for the community members themselves. Hopefully, the reader will feel empowered to challenge the sceptics.

Dee Jupp and Sohel Ibn Ali

Measuring Effectiveness – Participation, Empowerment and Downward Accountability

Date: 25th–26th September 2008
Venue: Melbourne, Australia


The annual Measuring Effectiveness conference is now only six weeks away. Online registrations are open on the website.

The conference will be held on Thursday 25th and Friday 26th September, 2008 in Melbourne, Australia. This year the conference is being held in partnership with The Australian National University. The 2008 conference will explore the themes of “Participation, Empowerment and Downward Accountability.”

The sessions will include case studies and panel discussions on these themes, as well as sessions focused on regional activities in Asia, the Pacific and Latin America/the Caribbean.
The draft conference schedule and session summaries will be available on the conference website within the next few days.

3rd annual Measuring Effectiveness conference: ‘Participation, Empowerment and Downward Accountability’


The 3rd annual Measuring Effectiveness conference will be held in Melbourne, on Thursday 25th & Friday 26th September, 2008.

We sincerely hope that you will again be inspired to attend this important event. This year sees World Vision Australia and The Australian National University partnering to bring you a conference themed around ‘Participation, Empowerment and Downward Accountability’.

Attached is the call for papers; submissions are due by Friday 20th June 2008, which gives you eight weeks to submit. Competition is increasing each year, so please meet this deadline to ensure your paper is given full consideration. For all further details please refer to the attachment. Please also distribute this amongst your colleagues and networks who may be interested.

Conference updates will be posted regularly on the World Vision website, and registrations will again be managed online. We will endeavour to have the conference brochure available online in late May 2008, and the final draft conference program, outlining the speakers/presenters and session outlines, available online by late August 2008. Further email correspondence will also be sent out in the coming months; however, the best source of information will be the website, so please check it regularly. There you will also find information and papers from previous ME conferences, as well as other development conferences.



Melissa Cadwell | Program Coordinator |
Program Effectiveness | World Vision Australia
phone / fax: +61 3 9287 2769