DEMOCRACY, GOVERNANCE AND RANDOMISED MEDIA ASSISTANCE

By Devra C. Moehler. BBC Media Action Research Report // Issue 03 // March 2014 // Governance. Available as a PDF.

Foreword by BBC Media Action

“This report summarises how experimental design has been used to assess the effectiveness of governance interventions and to understand the effects of the media on political opinion and behaviour. It provides an analysis of the benefits and drawbacks of experimental approaches and also highlights how field experiments can challenge the assumptions made by media support organisations about the role of the media in different countries.

The report highlights that – despite interest in the use of randomised controlled trials (RCTs) to assess governance outcomes – only a small number of field experiments have been conducted in the area of media, governance and democracy.

The results of these experiments are not widely known among donors or implementers. This report aims to address that gap. It shows that media initiatives have led to governance outcomes including improved accountability. However, they have also at times had unexpected adverse effects.

The studies conducted to date have been confined to a small number of countries, and the research questions posed were tied to specific interventions and governance outcomes. As a result, there is a limit to what policymakers and practitioners can infer. While this report highlights an opportunity for more experimental research, it also identifies that the complexity of media development can hinder the efficacy of experimental evaluation. It cautions that low-level interventions (eg those aimed at individuals as opposed to working at a national or organisational level) best lend themselves to experimentation. This could create incentives for researchers to undertake experimental research that answers questions focused on individual change rather than wider organisational and systemic change. For example, it would be relatively easy to assess whether a training course does or does not work: researchers can randomise which journalists are trained and assess the uptake and implementation of skills. However, it would be much harder to assess how capacity-building efforts affect a media house, its editorial values, content, audiences and media/state relations.

Designing such experiments will be challenging. The intention of this report is to start a conversation both within our own organisation and externally. As researchers we should be prepared to discover that experimentation may not be feasible or relevant for evaluation. In order to strengthen the evidence base, practitioners, researchers and donors need to agree which research questions can and should be answered using experimental research, and, in the absence of experimental research, to agree what constitutes good evidence.

BBC Media Action welcomes feedback on this report and all publications in our Bridging Theory and Practice research dissemination series.”
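The foreword's training-course example can be made concrete with a small illustration. The sketch below is not taken from the report; it is a hypothetical Python example (all names and scores are invented) of the logic it describes: randomly assigning individual journalists to a training course and estimating its effect with a simple difference in means.

```python
import random

# Hypothetical illustration (not from the report): individual-level random
# assignment of journalists to a training course, followed by a simple
# difference-in-means comparison of a post-course skills score.

def assign_treatment(participants, seed=42):
    """Randomly split participants into treatment and control groups."""
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

def difference_in_means(treated_scores, control_scores):
    """Estimate the average treatment effect as a difference in means."""
    return (sum(treated_scores) / len(treated_scores)
            - sum(control_scores) / len(control_scores))

journalists = [f"journalist_{i}" for i in range(20)]
treated, control = assign_treatment(journalists)
# After the course, skills scores would be collected for both groups, e.g.:
# effect = difference_in_means(post_scores_treated, post_scores_control)
```

In practice such a design would also need adequate sample sizes and appropriate statistical inference; the point here is only the individual-level randomisation that the foreword contrasts with organisational and systemic change, which is far harder to randomise.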

Contents
Introduction
Chapter 1: Background on DG field experiments
Chapter 2: Background on media development assistance and evaluation
Chapter 3: Current experiments and quasi-experimental studies on media in developing countries
Field experiments
Quasi-experiments
Chapter 4: Challenges of conducting field experiments on media development
Level of intervention
Complexity of intervention
Research planning under ambiguity
Chapter 5: Challenges to learning from field experiments on media development
Chapter 6: Solutions and opportunities
Research in media-scarce environments
Test assumptions about media effects
Investigate influences on media
References
