How to do a rigorous, evidence-focused literature review in international development

 

A Guidance Note, by
Jessica Hagen-Zanker and Richard Mallett
ODI Working Paper, September 2013
Available as pdf

Abstract: Building on previous reflections on the utility of systematic reviews in international development research, this paper describes an approach to carrying out a literature review that adheres to some of the core principles of ‘full’ systematic reviews, but that also contains space within the process for innovation and reflexivity. We discuss all stages of the review process, but pay particular attention to the retrieval phase, which, we argue, should consist of three interrelated tracks important for navigating difficult ‘information architecture’. We end by clarifying what it is in particular that sets this approach apart from fuller systematic reviews, as well as with some broader thoughts on the nature of ‘the literature review’ within international development and the social sciences more generally. The paper should thus be seen as sitting somewhere between a practical toolkit for those wishing to undertake a rigorous, evidence-focused review and a series of reflections on the role, purpose and application of literature reviews in policy research.

Special Issue on Systematic Reviews – Journal of Development Effectiveness

Volume 4, Issue 3, 2012

  • Why do we care about evidence synthesis? An introduction to the special issue on systematic reviews
  • How to do a good systematic review of effects in international development: a tool kit
    • Hugh Waddington, Howard White, Birte Snilstveit, Jorge Garcia Hombrados, Martina Vojtkova, Philip Davies, Ami Bhavsar, John Eyers, Tracey Perez Koehlmoos, Mark Petticrew, Jeffrey C. Valentine & Peter Tugwell, pages 359-387
  • Systematic reviews: from ‘bare bones’ reviews to policy relevance
  • Narrative approaches to systematic review and synthesis of evidence for international development policy and practice
  • Purity or pragmatism? Reflecting on the use of systematic review methodology in development
  • The benefits and challenges of using systematic reviews in international development research
    • Richard Mallett, Jessica Hagen-Zanker, Rachel Slater & Maren Duvendack, pages 445-455
  • Assessing ‘what works’ in international development: meta-analysis for sophisticated dummies
    • Maren Duvendack, Jorge Garcia Hombrados, Richard Palmer-Jones & Hugh Waddington, pages 456-471
  • The impact of daycare programmes on child health, nutrition and development in developing countries: a systematic review

New journal on Systematic Reviews

from BioMed Central Blog, thanks to tweet by @bengoldacre

“Systematic Reviews, a new journal in the BioMed Central portfolio, launches today. The journal, headed by Editors-in-Chief David Moher, Lesley Stewart and Paul Shekelle, aims to encompass all aspects of the design, conduct and reporting of systematic reviews.

As the first open access journal to focus on systematic reviews and associated literature, Systematic Reviews aims to publish high quality systematic review products including systematic review protocols, systematic reviews related to a very broad definition of health, rapid reviews, updates of already completed systematic reviews, and methods research related to the science of systematic reviews, such as decision modeling. The journal also aims to ensure that the results of all well-conducted systematic reviews are published, regardless of their outcome.

The journal supports innovation and transparency in the reporting of systematic reviews. In a thematic series published upon launch, six articles explore the importance of registering systematic reviews and review protocols, including a commentary from the Chief Medical Officer for the UK, Prof Dame Sally Davies, who writes on the value of registering reviews from a funder’s perspective.

With the launch of Systematic Reviews, the Editors-in-Chief note that ‘The explosion in the number of systematic reviews being published across a range of disciplines demonstrates widespread interest in a broad range of systematic review activities and products. Beyond the Cochrane Library there is no journal singularly devoted to all things systematic review. We hope Systematic Reviews will become that journal and that its open access status will attract authors and readers globally.’

The journal will provide an important addition to medical research, in promoting systematic reviews as an important means of analysing and assessing trial outcomes, and developing responses to failing approaches in healthcare treatments and research. The journal has already garnered support from the medical community, with Dr Ben Goldacre, author, journalist and research fellow at the London School of Hygiene and Tropical Medicine stating: ‘Medicine cannot depend on meandering essays, presenting an incomplete or inconsistent view of the scientific literature: to understand whether treatments work or not, we need complete summaries – collating all the evidence – using clearly explained methods to track it down. Systematic reviews are the key, and yet this tool is surprisingly new in medical science. At a time of rising concern about biased under-reporting of negative results, it’s good to see a new open access journal devoted to improving the science of systematic reviews.’

As the Editors-in-Chief note in their launch editorial, ‘individual studies are seldom sufficient to drive change. They are often too small to reach reliable conclusions, and for fair evaluation, it is important to look at the totality (or at least an unbiased sample of the totality) of evidence in favour of, against, or neutral to the healthcare intervention under consideration.’ Systematic Reviews aims to provide the platform for such evaluation, and in doing so, contribute to the wider development and improvement of healthcare.”

RD Comment: These developments are relevant to aid agencies that are commissioning synthesis-type studies of large fields of work, such as governance and accountability or livelihoods (both done by DFID recently), and to the evaluators considering this work. And… it’s great to see that this is an Open Access journal. Well done.

The initial issue is worth scanning, especially the editorial ‘Why prospective registration of systematic reviews makes sense’. See also ‘Evidence summaries: the evolution of a rapid review approach’.

There is more material on the use of systematic reviews of development aid interventions on the 3ie website.

Five challenges facing impact evaluation

PS 2018 02 23: The original NONIE Meeting 2011 website no longer exists. Use this reference, if needed: White, H. (2011) ‘Five challenges facing impact evaluation’, NONIE (http://nonie2011.org/?q=content/post-2).

“There has been enormous progress in impact evaluation of development interventions in the last five years. The 2006 CGD report When Will We Ever Learn? claimed that there was little rigorous evidence of what works in development. But there has been a huge surge in studies since then. By our count, there are over 800 completed and ongoing impact evaluations of socio-economic development interventions in low- and middle-income countries.

But this increase in numbers is just the start of the process of ‘improving lives through impact evaluation’, which was the sub-title of the CGD report and has become 3ie’s vision statement. Here are five major challenges facing the impact evaluation community:

1. Identify and strengthen processes to ensure that evidence is used in policy: studies are not an end in themselves, but a means to the end of better policy, programs and projects, and so better lives. At 3ie we are starting to document cases in which impact evaluations have, and have not, influenced policy to better understand how to go about this. DFID now requires evidence to be provided to justify providing support to new programs, an example which could be followed by other agencies.

2. Institutionalize impact evaluation: the development community is very prone to faddism. Impact evaluation could go the way of other fads and fall into disfavour. We need to demonstrate the usefulness of impact evaluation to help prevent this happening, hence my first point. But we also need to take steps to institutionalize the use of evidence in governments and development agencies. This step includes ensuring that ‘results’ are measured by impact, not outcome monitoring.

3. Improve evaluation designs to answer policy-relevant questions: quality impact evaluations embed the counterfactual analysis of attribution in a broader analysis of the causal chain, allowing an understanding of why interventions work, or not, and yielding policy relevant messages for better design and implementation. There have been steps in this direction, but researchers need better understanding of the approach and to genuinely embrace mixed methods in a meaningful way.

4. Make progress with small n impact evaluations: we all accept that we should be issues-led, not methods-led, and use the most appropriate method for the evaluation questions at hand. But the fact is that there is far more consensus on the evaluation of large n interventions, in which experimental and quasi-experimental approaches can be used, than there is about the approach to be used for small n interventions. If the call to base development spending on evidence of what works is to be heeded, then the development evaluation community needs to move to consensus on this point.

5. Expand knowledge and use of systematic reviews: single impact studies will also be subject to criticisms of weak external validity. Systematic reviews, which draw together evidence from all quality impact studies of a particular intervention in a rigorous manner, give stronger, more reliable, messages. There has been an escalation in the production of systematic reviews in development in the last year. The challenge is to ensure that these studies are policy relevant and used by policy makers.”

AusAID-DFID-3ie call for Systematic Reviews

The Australian Agency for International Development (AusAID), the UK’s Department for International Development (DFID) and the International Initiative for Impact Evaluation (3ie) have just launched a joint call for proposals for systematic reviews to strengthen the international community’s capacity for evidence-based policy making. AusAID, DFID and 3ie have identified around 59 priority systematic review questions across several themes: education; health; social protection and social inclusion; governance, fragile states, conflict and disasters; environment; infrastructure and technology; agriculture and rural development; economic development; and aid delivery and effectiveness.

Systematic reviews examine the existing evidence on a particular intervention or program in low and middle income countries, drawing also on evidence from developed countries when pertinent. The studies should be carried out according to recognized international standards and guidelines. All studies will be subject to an external review process and for this purpose teams will be encouraged to register for peer review with a relevant systematic review coordinating body.

Applications must be submitted using 3ie’s online application system. The deadline for submission of applications is 9am GMT on Monday, November 29, 2010.

For information on how to apply, guidance documents and the call for proposals, go to http://www.3ieimpact.org/systematicreviews/3ie-ausaid-dfid.php
