A move to more systematic and transparent approaches in qualitative evidence synthesis

An update on a review of published papers.
By Karin Hannes and Kirsten Macaitis. Qualitative Research 2012, 12: 402. Originally published online 11 May 2012.

Abstract

In 2007, the journal Qualitative Research published a review of qualitative evidence syntheses conducted between 1988 and 2004. It reported a lack of explicit detail regarding methods for searching, appraisal and synthesis, and a lack of emerging consensus on these issues. We present an update of this review for the period 2005–8. Not only has the number of published qualitative evidence syntheses doubled, but authors have also become more transparent about their searching and critical appraisal procedures. Nevertheless, for the synthesis component of the qualitative reviews, a black box remains between what people claim to use as a synthesis approach and what is actually done in practice. A detailed evaluation of how well authors master their chosen approach could provide important information for developers of particular methods, who seem to succeed in playing the game according to the rules. Clear methodological instructions need to be developed to assist others in applying these synthesis methods.

New Directions for Evaluation: Promoting Valuation in the Public Interest: Informing Policies for Judging Value in Evaluation

Spring 2012, Volume 2012, Issue 133, Pages 1–129

Editor’s Notes – George Julnes

  1. Editor’s notes (pages 1–2)

Research Articles

  1. Managing valuation (pages 3–15) – George Julnes
  2. The logic of valuing (pages 17–28) – Michael Scriven
  3. The evaluator’s role in valuing: Who and with whom (pages 29–41) – Marvin C. Alkin, Anne T. Vo and Christina A. Christie
  4. Step arounds for common pitfalls when valuing resources used versus resources produced (pages 43–52) – Brian T. Yates
  5. When one must go: The Canadian experience with strategic review and judging program value (pages 65–75) – François Dumaine
  6. Valuing, evaluation methods, and the politicization of the evaluation process (pages 77–83) – Eleanor Chelimsky
  7. Valuation and the American Evaluation Association: Helping 100 flowers bloom, or at least be understood? (pages 85–90) – Michael Morris

“Six Years of Lessons Learned in Monitoring and Evaluating Online Discussion Forums”

by Megan Avila, Kavitha Nallathambi, Catherine Richey and Lisa Mwaikambo, in Knowledge Management & E-Learning: An International Journal (KM&EL), Vol 3, No 4 (2011)

…which looks at how to evaluate virtual discussion forums held on the IBP (Implementing Best Practices in Reproductive Health) Knowledge Gateway – a platform for global health practitioners to exchange evidence-based information and knowledge to inform practice. Available as pdf. Found courtesy of Yaso Kunaratnam, IDS.

Abstract: “This paper presents the plan for evaluating virtual discussion forums held on the Implementing Best Practices in Reproductive Health (IBP) Knowledge Gateway, and its evolution over six years. Since 2005, the World Health Organization Department of Reproductive Health and Research (WHO/RHR), the Knowledge for Health (K4Health) Project based at Johns Hopkins Bloomberg School of Public Health’s Center for Communication Programs (JHU·CCP), and partners of the IBP Initiative have supported more than 50 virtual discussion forums on the IBP Knowledge Gateway. These discussions have provided global health practitioners with a platform to exchange evidence-based information and knowledge with colleagues working around the world. In this paper, the authors discuss challenges related to evaluating virtual discussions and present their evaluation plan for virtual discussions. The evaluation plan included the following three stages: (I) determining the value of the discussion forums, (II) in-depth exploration of the data, and (III) reflection and next steps, and was guided by the “Conceptual Framework for Monitoring and Evaluating Health Information Products and Services”, which was published as part of the Guide to Monitoring and Evaluation of Health Information Products and Services. An analysis of data from 26 forums is presented and discussed in light of this framework. The paper also includes next steps for improving the evaluation of future virtual discussions.”


What shapes research impact on policy?

…Understanding research uptake in sexual and reproductive health policy processes in resource poor contexts

Andy Sumner, Jo Crichton, Sally Theobald, Eliya Zulu and Justin Parkhurst. Health Research Policy and Systems 2011, 9(Suppl 1): S3. Published: 16 June 2011.

Abstract “Assessing the impact that research evidence has on policy is complex. It involves consideration of conceptual issues of what determines research impact and policy change. There is also a range of methodological issues relating to the question of attribution and the counter-factual. The dynamics of SRH, HIV and AIDS, like many policy arenas, are partly generic and partly issue- and context-specific. Against this background, this article reviews some of the main conceptualisations of research impact on policy, including generic determinants of research impact identified across a range of settings, as well as the specificities of SRH in particular. We find that there is scope for greater cross-fertilisation of concepts, models and experiences between public health researchers and political scientists working in international development and research impact evaluation. We identify aspects of the policy landscape and drivers of policy change commonly occurring across multiple sectors and studies to create a framework that researchers can use to examine the influences on research uptake in specific settings, in order to guide attempts to ensure uptake of their findings. This framework has the advantage that it distinguishes between pre-existing factors influencing uptake and the ways in which researchers can actively influence the policy landscape and promote research uptake through their policy engagement actions and strategies. We apply this framework to examples from the case study papers in this supplement, with specific discussion about the dynamics of SRH policy processes in resource poor contexts. We conclude by highlighting the need for continued multi-sectoral work on understanding and measuring research uptake and for prospective approaches to receive greater attention from policy analysts.”