“Instruments, Randomization and Learning about Development”

Angus Deaton, Research Program in Development Studies, Center for Health and Wellbeing, Princeton University, March 2010. Full text as pdf

ABSTRACT
There is currently much debate about the effectiveness of foreign aid and about what kind of projects can engender economic development. There is skepticism about the ability of econometric analysis to resolve these issues, or of development agencies to learn from their own experience. In response, there is increasing use in development economics of randomized controlled trials (RCTs) to accumulate credible knowledge of what works, without over-reliance on questionable theory or statistical methods. When RCTs are not possible, the proponents of these methods advocate quasi-randomization through instrumental variable (IV) techniques or natural experiments. I argue that many of these applications are unlikely to recover quantities that are useful for policy or understanding: two key issues are the misunderstanding of exogeneity, and the handling of heterogeneity. I illustrate from the literature on aid and growth. Actual randomization faces similar problems as does quasi-randomization, notwithstanding rhetoric to the contrary. I argue that experiments have no special ability to produce more credible knowledge than other methods, and that actual experiments are frequently subject to practical problems that undermine any claims to statistical or epistemic superiority. I illustrate using prominent experiments in development and elsewhere. As with IV methods, RCT-based evaluation of projects, without guidance from an understanding of underlying mechanisms, is unlikely to lead to scientific progress in the understanding of economic development. I welcome recent trends in development experimentation away from the evaluation of projects and towards the evaluation of theoretical mechanisms.

See also Why Works? by Lawrence Haddad, Development Horizons blog

See also Carlos Barahona’s Randomised Control Trials for the Impact Evaluation of Development Initiatives: A Statistician’s Point of View. Introduction: This [ILAC Working Paper] paper contains the technical and practical reflections of a statistician on the use of Randomised Control Trial designs (RCT) for evaluating the impact of development initiatives. It is divided into three parts. The first part discusses RCTs in impact evaluation, their origin, how they have developed and the debate that has been generated in evaluation circles. The second part examines difficult issues faced in applying RCT designs to the impact evaluation of development initiatives, to what extent this type of design can be applied rigorously, the validity of the assumptions underlying RCT designs in this context, and the opportunities and constraints inherent in their adoption. The third part discusses some of the ethical issues raised by RCTs, the need to establish ethical standards for studies about development options and the need for an open mind in the selection of research methods and tools.

WEBINAR SERIES: EMERGING PRACTICES IN DEVELOPMENT EVALUATION

UNICEF, the Rockefeller Foundation and Claremont Graduate University (CGU), in partnership with IOCE and DevInfo, are pleased to announce a series of live webinars on “Emerging Practices in Development Evaluation”. This series has allowed CGU to expand greatly on our previous online offerings, and includes a range of guest speakers with experience in development evaluation.

Please see the full program below, or visit MyM&E at http://www.mymande.org/?q=content/emerging-practices-in-development-evaluation&x=cl

The first webinar on “Using a developing country lens in evaluation”, with Zenda Ofir, former President, African Evaluation Association, and Shiva Kumar, independent consultant, India, will take place on Wednesday, October 13th. For additional information and instructions, please visit http://www.mymande.org/sites/default/files/emerging_practices_ofir_kumar_final.pdf

Webinars are free and open to interested people. You may attend virtually from your personal or work computer anywhere in the world. In addition to watching live presentations, you will have the option to ask questions and provide comments.

These webinars will enable the sharing of good practices and lessons learned. Global-level speakers will contribute international perspectives. No prior registration is required. To attend, you just need a computer and Internet connection.

The scheduled webinars are as follows:

  • Emerging Practices in Evaluating Policy Influence, by Fred Carden, Director, Evaluation Unit, International Development Research Center (IDRC), 16 November 2010
  • Evaluating Networks and Partnerships, by Jared Raynor, Senior Consultant at TCC Group, 7 December 2010
  • Evaluating Capacity Development, by Peter Morgan, Independent Consultant, January 2011
  • Evaluating Organizational Performance, by Charles Lusthaus, Co-Founder and Chairman of the Board of Directors, Universalia Management Group; and Associate Professor, McGill University, February 2011
  • Evaluating Innovation, by Steve Rochlin, Director and U.S. Representative for AccountAbility, March 2011
  • Evaluating Sustainable Development, by Alastair Bradstock, Business Development Director, International Institute for Environment and Development (IIED), April 2011

Announcements with detailed instructions will be sent out the week before the actual webinar. Please excuse cross-postings.

Paul Thomas
Director of External Affairs
School of Behavioral and Organizational Sciences
Claremont Graduate University

Do Less Transparent Donors Allocate Aid Differently?

Jörg Faust, German Development Institute (DIE), 2010, APSA 2010 Annual Meeting Paper. Available as pdf

Abstract:
“Foreign aid is said to be more effective for development if it is allocated to relatively poor recipient countries with relatively sound political institutions. This allocation rule also meets the preferences of citizens in donor countries, who expect their government to spend aid on countries that are needy and institutionally prepared to use it well. Unfortunately, aid allocation in the past has often diverged from this rule because donor governments and other bureaucratic agents often pursue special interest politics. This paper studies the variance of aid allocation patterns across donor countries. It relates this variance of aid allocation patterns to different levels of political transparency within donor countries. Where political transparency is high, donor governments are more accountable and have less room to diverge from technocratic expertise and citizens’ preferences. An empirical test, using data for the 1998-2008 period, confirms this hypothesis. Donor countries with higher levels of political transparency allocate aid more according to recipient countries’ neediness and institutional performance.”

Reflexive Monitoring in Action: A guide for monitoring system innovation projects

“Researchers at Wageningen University and the VU University Amsterdam, the Netherlands, have been working together on a type of monitoring that they have called reflexive monitoring in action (RMA). RMA has been developed especially for projects that aim to contribute to the sustainable development of a sector or region by working on system innovation. Sustainable development demands simultaneous changes at many levels of society and in multiple domains: ecological, economic, political and scientific. It requires choices to be made that are radically different from the usual practices, habits, interrelationships and institutional structures. But that is precisely why it is not easy. System innovation projects therefore benefit from a type of monitoring that encourages the ‘reflexivity’ of the project itself, its ability to affect and interact with the environment within which it operates. If a project wants to realise the far-reaching ambitions of system innovation, then reflection and learning must be tightly interwoven within it. And that learning should focus on structural changes. RMA can contribute to this. In the guide, which aims to support the work of project managers, monitors and clients, the authors present the characteristics and the value of Reflexive Monitoring in Action, together with practical guidelines that will help put that monitoring into practice. At the end of the guide the authors provide detailed descriptions of seven monitoring tools.”

The guide can be freely downloaded in pdf format, in English or Dutch, from http://tinyurl.com/wurcispubs or http://tinyurl.com/vupubs.

The guide is also available in a printed version (Dutch only), through Boxpress (http://www.boxpressshop.nl). Price: € 49,95 (full colour) or € 29,95 (black and white with pictures in full colour). For more information please contact: barbara.vanmierlo@wur.nl

DfID Seeks Suggestions for Implementing Aid Transparency Initiative

On Devex, by Eliza Villarino, 6 September 2010

“The U.K. Department for International Development launches an online discussion to seek input on how it should implement the UKaid Transparency Guarantee.

The U.K. Department for International Development has opened an online discussion to help it decide how to implement its aid transparency initiative.

The UKaid Transparency Guarantee forms part of the coalition government’s commitment to boost the transparency of DfID aid. As reported by Devex, U.K. Secretary of State for International Development Andrew Mitchell announced the guarantee, along with the intention to create an independent aid watchdog, in June.

DfID is urging civil society groups, think tanks and other organizations working on transparency to send an e-mail to aidtransparency@dfid.gov.uk if they wish to contribute to the discussion.”

PS – 19th October 2010: A summary of the online discussion is now available here as a pdf: 2010 Summary of Huddle Discussions on UKATG

“Britons think development aid for poor countries is wasted”

Mark Tran, Guardian.co.uk, Wednesday 8 September 2010 11.40 BST

“More than half of Britons think development aid is wasted and do not support the coalition government’s policy of ring-fencing assistance for poor countries, a survey shows.

Aid to Developing Countries: Where does the UK Public Stand?, published by the Institute of Development Studies (IDS) in Brighton, recommends development groups take a new approach to communicating with the public about how and when aid works to address perceptions that most aid is wasted.

The aid budget is protected from spending cuts because the government is committed to meeting the UN target of spending 0.7% of national income on aid by 2013, but the survey found that 63% of people think aid to poor countries should be cut as the government seeks to reduce the budget deficit, while 52% think most UK aid to developing countries is ineffective…”

Stories from Aidland: Dancing to the Tune

(From The Broker)

Dancing to the tune, by Nancy Okail, July 2010

This story chronicles my involvement in the Organisation for Economic Co-operation and Development (OECD) monitoring survey on aid effectiveness in a North African country in 2006. The OECD monitoring survey was a tool designed to assess how aid was spent in this recipient country and measure progress in relation to the five dimensions of the Paris Declaration: ownership, alignment, harmonization, results and mutual accountability. I was a participant observer at one of the offices in the country’s Ministry of International Cooperation entrusted to conduct the survey.

DFID Draft Structural Reform Plan July 2010

Available  on the DFID website and as a pdf.

“Structural Reform Plans are the key tool of the Coalition Government for making departments accountable for the implementation of the reforms set out in the Coalition Agreement. They replace the old, top-down systems of targets and central micromanagement.

The reforms set out in each department’s SRP are designed to turn government on its head, taking power away from Whitehall and putting it into the hands of people and communities. Once these reforms are in place, people themselves will have the power to improve our country and our public services, through the mechanisms of local democratic accountability, competition, choice, and social action.

The reform plans set out in this document are consistent with and form part of the Department’s contribution to the Spending Review. All departmental spending is subject to the Spending Review.

We have adopted a cautious view of the timescales for delivering all legislative measures due to the unpredictability of pressures on Parliamentary time.”

Launch of online database of research accountability tools

Announcement: 7 September: launch of online database of research accountability tools

The One World Trust, with support from the IDRC, has created an interactive, online database of tools to help organisations conducting policy relevant research become more accountable.

Processes of innovation and research are fundamental to improvements in quality of life and to creating a better society. But to realise these benefits, the quality of research alone is not enough. Organisations engaged in policy-relevant research and innovation must continually take into account and balance the needs of a diverse set of stakeholders: from the intended research users, to their clients and donors, to the research community and the research participants.  Responsiveness to all of these is crucial if they are to be legitimate and effective. In this, accountable processes are as important as high quality research products.

The Trust has built the online accountability database to support researchers, campaigners and research managers to think through the way they use evidence to influence policy in an accountable way. The database takes into account that research organisations are increasingly diverse – they are no longer just universities, but private companies, public institutes and non-profit think-tanks. No single framework can encompass this diversity.

Instead, the database provides an inventory of over two hundred tools, standards and processes within a broad, overarching accountability framework. With a dynamic interface and several search functions, it allows users to identify aspects of accountability that interest them, and provides ideas to improve their accountability in this context. Each tool is supported by sources and further reading.

We also encourage engagement with and discussion on the database content, through allowing users to comment on individual tools, or to submit their own tools, processes and standards for inclusion.

The database is an output of a three-year project, titled “Accountability Principles for Research Organisations.” Working with partners across the globe, the project has generated an accountability framework which is sufficiently flexible to apply to many contexts and different organisations.

The database will be available online from 7 September.

For more information about the project please feel free to contact us at bwhitty@oneworldtrust.org. For the database, please visit www.oneworldtrust.org/apro

Conference: Systemic Approaches in Evaluation

Date: January 25th/ 26th 2011
Venue: Eschborn, Germany

Call for Papers & Save the Date

Development programs promote complex reforms and change processes. Today, such processes are characterized more than ever by uncertainty and unpredictability, posing a major challenge to the evaluation of development projects. In order to understand which projects work, why and under which conditions, evaluations need to embrace the interaction of various influencing factors and the multi-dimensionality of societal change. However, current evaluation approaches often presuppose predictability and linearity of event chains. They reflect the natural human need for certainty but are often not suited to comprehending complex situations.

In order to fill this gap, systemic approaches in the evaluation of development programs are increasingly being discussed. A key concept is interdependency instead of linear cause-effect relations. Systemic evaluations look at interrelations instead of analyzing isolated facts and figures. They focus on the interaction between various stakeholders with different motivations, interests, perceptions, and perspectives.

On January 25th and 26th, the Evaluation Unit of GTZ offers a forum to discuss systemic approaches to evaluation at an international conference with participants from politics, science and practice. On the basis of presentations, discussion rounds and case studies, we will tackle, amongst others, the following questions:

·         What characterizes a “systemic evaluation“?
·         What is new about systemic evaluations, what makes them different from other (e.g.  participatory) approaches?
·         For which kind of evaluations are systemic approaches (not) useful?
·         Which concrete methods and tools from systemic consulting practice can be used?
·         Which quality standards do systemic evaluations have to meet?

We welcome contributions on good practice examples and/or systemic tools for evaluation. Please submit your proposals (1000 words maximum) in English by October 31st to Sabine Dinges (sabine.dinges@gtz.de).

We look forward to receiving your abstracts. Further information on the registration process will soon be provided.

Martina Vahlhaus, Head of Evaluation Unit
Michael Gajo, Senior Evaluation Officer
GTZ German Technical Cooperation
P.O. Box 5180
65726 Eschborn
