In this posting I am drawing attention to a blog by Michaela Raab and Wolf Stuppert, which is exceptional (or at least unusual) in a number of respects. The blog can be found at http://www.evawreview.de/
Firstly, the blog is not just about the results of a review but, more importantly, about the review process itself, written as the review proceeds. (I have not seen many blogs of this kind, but if you know of any others please let me know.)
Secondly, the blog is about the use of QCA (Qualitative Comparative Analysis) and process tracing. There have been a number of articles about QCA in the journal Evaluation, but generally speaking relatively few evaluators working with development projects know much about QCA or process tracing.
Thirdly, the blog is about the use of QCA and process tracing as a means of reviewing the findings of past evaluations of interventions relating to violence against women and girls. In other words, it is another approach to undertaking a kind of systematic review, notably one that does not require throwing out 95% of the available studies because their contents do not fit the methodology being used to do the review.
Fourthly, it is about combining the use of QCA and process tracing, i.e. combining cross-case comparisons with within-case analyses. QCA can help identify causal configurations of conditions associated with specific outcomes. But once found, these associations need to be examined in depth to ensure that plausible causal mechanisms are at work. That is where process tracing comes into play.
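To give a rough sense of what the cross-case step involves, here is a minimal sketch of a QCA-style truth table in Python. The conditions, cases and column names are entirely hypothetical, invented only to illustrate the idea of configurations of conditions associated with an outcome; real QCA software would go further and minimise the resulting configurations.

```python
# A minimal sketch of the cross-case step in QCA: building a truth table
# from binary-coded case data. All data here are hypothetical.
import pandas as pd

# Each row is a case (e.g. an evaluated intervention); 1 = condition present, 0 = absent.
cases = pd.DataFrame({
    "community_mobilisation": [1, 1, 0, 0, 1, 0],
    "legal_reform":           [1, 0, 1, 0, 1, 1],
    "long_duration":          [1, 1, 1, 0, 0, 0],
    "outcome_achieved":       [1, 1, 1, 0, 1, 0],
})

conditions = ["community_mobilisation", "legal_reform", "long_duration"]

# The truth table: one row per observed configuration of conditions,
# with the number of cases showing it and the consistency of the outcome.
truth_table = (
    cases.groupby(conditions)["outcome_achieved"]
    .agg(n_cases="count", consistency="mean")
    .reset_index()
)
print(truth_table)
```

Configurations with high consistency (here, a mean outcome close to 1) are the candidates that process tracing would then examine within individual cases.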
I have two hopes for the EVAWG Review blog. One is that it will provide a sufficiently transparent account of the use of QCA to enable new potential users to understand how it works, along with an appreciation of its potentials and difficulties. The other is that the dataset used in the QCA analysis will be made publicly available, ideally via the blog itself. One of the merits of QCA analyses published so far is that the datasets are often included in the published articles, which means others can re-analyse the same data, perhaps from a different perspective. For example, I would like to test the results of the QCA analyses by using another method for generating results with a comparable structure (i.e. descriptions of one or more configurations of conditions associated with the presence and absence of expected outcomes). I have described this method elsewhere (Decision Tree algorithms, as used in data mining; see the reading list below).
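For illustration only, here is a minimal sketch of how a Decision Tree could be fitted to the same kind of binary-coded, hypothetical data used in the truth-table sketch above. This is not the EVAWG team's analysis, nor the exact procedure described in the 2012 paper listed below; it simply shows, using scikit-learn, why the output has a structure comparable to QCA's.

```python
# A minimal sketch of the Decision Tree alternative: each root-to-leaf path
# can be read as a configuration of conditions associated with the presence
# or absence of the outcome. Data and column names are hypothetical.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

cases = pd.DataFrame({
    "community_mobilisation": [1, 1, 0, 0, 1, 0],
    "legal_reform":           [1, 0, 1, 0, 1, 1],
    "long_duration":          [1, 1, 1, 0, 0, 0],
    "outcome_achieved":       [1, 1, 1, 0, 1, 0],
})

X = cases[["community_mobilisation", "legal_reform", "long_duration"]]
y = cases["outcome_achieved"]

# Fit a shallow tree so the branches stay readable as configurations.
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Print the tree as nested if/else rules, one configuration per leaf.
print(export_text(tree, feature_names=list(X.columns)))
```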
There are also some challenges facing this use of QCA, and I would like to see how the blog's authors deal with them. In RCTs there need to be both comparable interventions and comparable outcomes, e.g. cash transfers provided to many people in some standardised manner, and a common measure of household poverty status. With QCA (and Decision Tree) analyses comparable outcomes are still needed, but not comparable interventions: these can be many and varied, as can the wider contexts in which they are provided. The challenge with Raab and Stuppert's work on VAWG is that there will be many and varied outcome measures as well as interventions. They will probably need to do multiple QCA analyses, each focusing on a sub-set of evaluations within which there are one or more comparable outcomes. But by focusing in this way, they may end up with too few cases (evaluations) to produce plausible results, given the diversity of (possibly) causal conditions they will be exploring.
There is a much bigger challenge still. On re-reading the blog I realised that this is not simply a kind of systematic review of the available evidence using a different method. Instead it is a kind of meta-evaluation, where the focus is on comparing the evaluation methods used across the population of evaluations they manage to amass. The problem of finding comparable outcomes is much bigger here. For example, on what basis will they rate or categorise evaluations as successful (e.g. valid and/or useful)? There seems to be a chicken-and-egg problem lurking here. Help!
PS1: I should add that this work is being funded by DFID, but the types of evaluations being reviewed are not limited to evaluations of DFID projects.
PS2 2013 11 07: I now see from the team's latest blog posting that the common outcome of interest will be the usefulness of the evaluation. I would be interested to see how they assess usefulness in a way that is reasonably reliable.
PS3 2014 01 07: I continue to be impressed by the team's efforts to publicly document the progress of their work. Their Scoping Report is now available online, along with a blog commentary on progress to date (2014 01 06).
PS4 2014 03 27: The Inception Report is now available on the VAWG blog. It is well worth reading, especially the sections explaining the methodology and the evaluation team's response to comments by the Specialised Evaluation and Quality Assurance Service (SEQUAS, 4 March 2014) on pages 56-62, some of which are quite tough.
Some related/relevant reading:
- COMPASSS: a website devoted to resources on QCA
- Goertz, Gary, and James Mahoney. A Tale of Two Cultures: Qualitative and Quantitative Research in the Social Sciences. Princeton University Press, 2012.
- Mahoney, James. “The Logic of Process Tracing Tests in the Social Sciences.” Sociological Methods & Research (March 2, 2012): 1–28. doi:10.1177/0049124112437709.
- Davies, Rick. “Where There Is No Single Theory of Change: The Uses of Decision Tree Models.” 2012.
Itad is in the process of building an evaluation methodology for a multi-country programme which includes process mapping and QCA, i.e. looking at both between-case and intra-case analysis. We hope to be able to release details once the client has approved this, and will look at blogging updates as we implement.
Many thanks for the post, which reflects our own hopes. And we are aware our project is an ambitious one!
Wolf and I will respond on http://www.evawreview.de in a couple of weeks’ time, when we’ll have completed the scoping phase, and recruited and trained coders for the first round of coding.
Meanwhile, I have detected two small typos: my real name is RAAB, and the blog is called http://www.evawreview.de
Cheers,
Hi Michaela
My apologies for the typos, which I have now corrected.
Could you correct my name on your blog from Davis to Davies?
regards, rick davies