Stakeholder analysis and social network analysis in natural resource management

Christina Prell, Klaus Hubacek, Mark Reed, Department of Sociological Studies, University of Sheffield, and Sustainability Research Institute, School of Earth and Environment, University of Leeds, 2009

Full text here

Introduction

Many conservation initiatives fail because they pay inadequate attention to the interests and characteristics of stakeholders (Grimble and Wellard, 1997). As a consequence, stakeholder analysis has gained increasing attention and is now integral to many participatory natural resource management initiatives (Mushove and Vogel, 2005). However, current methods for stakeholder analysis have a number of important limitations. For example, stakeholders are usually identified and categorized through a subjective assessment of their relative power, influence and legitimacy (Mitchell et al., 1997; Frooman, 1999). Although a wide variety of categorization schemes have emerged from the literature (such as primary and secondary (Clarkson, 1995); actors and those acted upon (Mitchell et al., 1997); strategic and moral (Goodpaster, 1991); and generic and specific (Carroll, 1989)), these methods have often overlooked the role communication networks can play in categorizing and understanding stakeholder relationships. Social network analysis (SNA) offers one solution to these limitations.

Environmental applications of SNA are just beginning to emerge, and so far have focused on understanding characteristics of social networks that increase the likelihood of collective action and successful natural resource management (Schneider et al., 2003; Tompkins and Adger, 2004; Newman and Dale, 2004; Bodin et al., 2006; Crona and Bodin, 2006). In this paper, we harness and expand upon this knowledge to inform stakeholder analysis for participatory natural resource management. By participatory natural resource management we mean a process that engages stakeholders on multiple levels of decision making and facilitates the formation and strengthening of relationships among stakeholders for mutual learning (Grimble and Wellard, 1997; Dougill et al., 2006; Stringer et al., 2006). To enhance stakeholder analysis, we use SNA to identify the role and influence of different stakeholders and categories of stakeholder according to their positions within the network. We do this using case study material from the Peak District National Park, UK.
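Although the paper reports the analysis itself rather than code, the kind of network measures this approach builds on are straightforward to compute. Below is a minimal sketch using the open-source networkx Python library; the stakeholder names and communication ties are invented for illustration, not taken from the Peak District case study.

```python
# Minimal sketch of SNA-based stakeholder categorization.
# All stakeholder names and ties below are invented for illustration.
import networkx as nx

# Undirected communication network: an edge means two stakeholders
# report communicating with each other about land management.
G = nx.Graph()
G.add_edges_from([
    ("water company", "national park authority"),
    ("national park authority", "farmers"),
    ("national park authority", "conservationists"),
    ("farmers", "grouse moor owners"),
    ("conservationists", "recreational users"),
])

# Degree centrality: how many direct ties a stakeholder has
# (a rough proxy for local influence).
degree = nx.degree_centrality(G)

# Betweenness centrality: how often a stakeholder lies on the shortest
# path between other pairs (a proxy for brokerage between subgroups).
betweenness = nx.betweenness_centrality(G)

for node in G.nodes:
    print(f"{node}: degree={degree[node]:.2f}, "
          f"betweenness={betweenness[node]:.2f}")
```

In a sketch like this, a stakeholder with high betweenness sits on many shortest paths between others and may act as a broker between otherwise weakly connected groups, which is one way network position can inform the categorization of stakeholders.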

Survey of Donor Approaches to Governance Assessment

Published by the Organisation for Economic Co-operation and Development's Development Assistance Committee (OECD-DAC)
Size: 34 pages (358 KB)
Full text here

Executive Summary

Bilateral and multilateral development agencies have engaged intensively in assessing governance over the last decade. To explore opportunities for increased harmonization and alignment in this area, members of the OECD DAC’s GOVNET have commissioned a survey of donor approaches to governance assessments. The survey reported here focuses on general and thematic governance assessment approaches actually used by agencies.

The survey identified 11 agencies with 17 general assessment tools in use and 3 more under development; a further 6 agencies that currently have no tools of their own are developing them. 9 agencies reported having 13 thematic tools in use, and 4 of these agencies, along with 3 others, are developing new ones. The thematic tools category includes assessment tools related to conflict, human rights, corruption, and sector assessments, as well as tools which focus on particular themes (e.g. financial governance).

Training in Most Significant Change Technique (MSC) in Oxford, UK

Date: 28-29th July 2009
Venue: Oxford, UK

MSC is a powerful tool for monitoring, evaluation and organisational learning. It goes beyond merely capturing and documenting participants' stories of impact to offering a means of engaging in effective dialogue about what you are achieving. Each story represents the storyteller's interpretation of impact, which is then reviewed and discussed. The process offers an opportunity for a diverse range of stakeholders to enter into a dialogue about program intention, impact and, ultimately, future direction. MSC has much to offer your existing M&E framework, being especially good at capturing that traditionally hard-to-capture information about what difference you made in the hearts and minds of those you were targeting for benefit; but it has much to offer beyond merely reporting on outcomes. This two-day training workshop provides an introduction to MSC, including designing your own MSC process. Participants will be provided with experiential learning opportunities and examples of real applications of the technique. We will also share our experiences of adapting MSC for use in evaluation studies.
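As a rough sketch of the selection step at the heart of MSC (this is not Clear Horizon's training material; the domains, stories and vote counts below are invented), stories of change are grouped into domains and a review panel selects the most significant story in each:

```python
# Illustrative sketch of the MSC selection step: stories of change are
# grouped by domain, and a review panel selects the most significant
# story in each domain. All domains, stories and votes are invented.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Story:
    storyteller: str
    domain: str           # e.g. "changes in people's lives"
    text: str
    panel_votes: int = 0  # votes recorded after panel discussion

stories = [
    Story("farmer A", "changes in practice", "Started rotational grazing...", 3),
    Story("farmer B", "changes in practice", "Joined a peer learning group...", 5),
    Story("resident C", "changes in people's lives", "Felt heard for the first time...", 4),
]

# Group stories by domain, then pick the story the panel judged most
# significant in each. In real MSC the judgement is reached through
# structured dialogue, and the reasons for selection are documented.
by_domain = defaultdict(list)
for s in stories:
    by_domain[s.domain].append(s)

for domain, group in by_domain.items():
    chosen = max(group, key=lambda s: s.panel_votes)
    print(f"{domain}: selected story from {chosen.storyteller}")
```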

Where: Oxford, England (venue to be determined)

When: Tuesday 28th & Wednesday 29th of July

Cost: £550

* To secure your enrolment, download and return your registration form today: http://www.clearhorizon.com.au/training-mentoring/training/training-courses/most-significant-change-uk/

* Concession rates and multiple-participant discounts are available; contact tracey@clearhorizon.com.au for more information.

International Program for Development Evaluation Training (IPDET)

Date: June 8 – July 3
Venue: Carleton University, Ottawa, Canada

On-line applications are now being accepted for IPDET 2009 (June 8 – July 3) at www.ipdet.org.

It is time to apply for one to four weeks at the International Program for Development Evaluation Training (IPDET). Now entering its 9th year, the program will be held from June 8th through July 3rd at Carleton University in Ottawa, Canada. Weeks 1 and 2 are a graduate-level intensive applied core course on development evaluation. Weeks 3 and 4 feature a choice of 30 workshops that go in depth on specific development evaluation topics. New this year are workshops on evaluating governance and on using the theory of change model for evaluating environmental and social impacts. Visit the website for more information about the new workshops and instructors, as well as the returning ones. Note that you must register on the IPDET website www.ipdet.org before you can log in and access the on-line application form. If you experience difficulties with the application process, contact Mary Dixon, the IPDET Registrar, at mary_dixon@carleton.ca. IPDET is a collaboration between the Independent Evaluation Group of the World Bank and Carleton University, with the support of several donor organizations.

3ie news: Working paper series launched

The first two 3ie working papers are now available:

Working Paper No. 1, Reflections on some current debates in impact evaluation, by Howard White, reviews some of the criticisms commonly leveled at quantitative approaches to impact evaluation, arguing that many are based on misconceptions; and

Working Paper No. 2, Better Evidence for a Better World, edited by Mark Lipsey and Eamonn Noonan (produced jointly with The Campbell Collaboration), reviews the need for, and uses of, evidence in various fields of social policy.

Live webcast event on Real World Evaluation with Jim Rugh and Michael Bamberger

Date: Tuesday, May 12th
Venue: Internet

UNICEF CEE/CIS, WHO/PAHO and DevInfo, in partnership with IDEAS (International Development Evaluation Association) and IOCE (International Organization for Cooperation in Evaluation), are pleased to announce a live webcast event on Real World Evaluation with Jim Rugh and Michael Bamberger on Tuesday, May 12th, at 10 AM Washington time.

This event is organized as part of the monthly Knowledge Sharing Events on Country-led M&E systems. It is free and open to anyone interested, and will enable the sharing of good practices and lessons learned, with global-level speakers contributing international perspectives. In addition to watching live presentations, you will have the option to ask questions and provide comments. You may attend virtually from your personal or work computer anywhere in the world; you just need a computer, an internet connection, a microphone and speakers.

What should be found within an M&E framework / plan?

I was asked this question by a client some time ago. After some thought about something I felt I should have already known, I drafted a one-page guidance note for my client. The contents of the note also benefited from a discussion about appropriate expectations of M&E frameworks with other M&E people on the MandE NEWS email list.

I have attached the one-page guidance note here: What should be found in an M&E Framework / Plan?

Please feel free to post your comments on this document below, and to suggest any other documents or websites where this topic is covered.

PS: 28 October 2011: This one-pager contains a summary of the proposed contents of an M&E Framework for a DFID project, prepared this year.

PS: 12 February 2014: Benedictus Dwiagus Stepantoro has sent me this link to the DFAT (formerly AusAID) Monitoring and Evaluation Standards, which were updated in 2013. He points especially to standard no. 2 on the Initiative M&E System there, and comments:

“I use it all the time as a reference in checking the quality of M&E systems in programs/projects/initiatives, as I often receive 3-5 M&E System/Plan documents every year to be assessed.

The main key features of an M&E system there are:

– Should have an ‘evaluability assessment’ as a basis for developing the M&E system

– Have clarity on program outcomes, key outputs, approach/modality and the logic around them

– Have evaluation questions, or key performance questions/indicators

– Methodology/tools, including a baseline

– Should have sufficient resources (people with the right expertise, funds for M&E activities, etc.)

– Scheduling of M&E activities

– Costing/budget allocation for M&E

– Clear responsibility

People often show me a logframe or a matrix of indicators and proudly state that their program has an ‘M&E System’. But for me, a logframe alone is not an M&E system, and a matrix of indicators alone is not an M&E system.”
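Purely as an illustration of how a checklist like the one above might be applied when screening several M&E plan documents, here is a hypothetical Python sketch; the checklist items paraphrase the list above rather than quoting the DFAT standard, and the example plan is invented.

```python
# Hypothetical sketch: screening an M&E plan document against the key
# features listed above (the items paraphrase, not quote, DFAT M&E
# Standard no. 2). The screening logic and example plan are invented.
CHECKLIST = [
    "evaluability assessment",
    "program logic (outcomes, outputs, modality)",
    "evaluation questions / key performance questions",
    "methodology and tools, including baseline",
    "resourcing (expertise and funds)",
    "schedule of M&E activities",
    "M&E budget",
    "clear responsibilities",
]

def screen_plan(plan_sections: set[str]) -> None:
    """Print which checklist items the plan covers and which it misses."""
    for item in CHECKLIST:
        status = "present" if item in plan_sections else "MISSING"
        print(f"[{status:>7}] {item}")

# A plan offering only a logframe fails most checks, echoing the point
# that a logframe alone is not an M&E system.
screen_plan({"program logic (outcomes, outputs, modality)"})
```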

Outcomes monitoring and IT: Finding the best solution

Date: 21 May 2009
Venue: London

This new conference is the first at which charities will have the chance to meet a range of system providers offering resources to help with outcomes monitoring.

The conference will help you:
· learn how IT can help you track the difference you make, save time and cut costs
· understand more about the processes involved in implementing an outcomes-based IT system
· gain an overview of the range of solutions available
· identify specific IT systems that will help you measure the outcomes of your work.

Training: Monitoring and Evaluation for Results

Date: July 6-17, 2009
Venue: The World Bank Headquarters,
1818 H Street NW, Washington, DC 20433

SPONSORS
World Bank Institute Evaluation Group (WBIEG)

TOPICS
Introduction to Monitoring and Evaluation
Logic Models and Evaluation Questions
Indicators and Measurement
Research Designs
Data Collection
Reconstructing Baseline Data
Sampling
Data Analysis
The Practice of Impact Evaluation
Reporting Results and Utilization of Evaluations
Managing Monitoring and Evaluation Functions

Training: Monitoring and Evaluation for Results

Date: May 11-15, 2009
Venue: Hotel Africa in Tunis, Tunisia

SPONSORS
World Bank Institute Evaluation Group (WBIEG) and Joint Africa Institute (JAI)

TOPICS
Introduction to Monitoring and Evaluation
Logic Models and Evaluation Questions
Indicators and Measurement
Research Designs
Data Collection
Reconstructing Baseline Data
Sampling
Data Analysis
The Practice of Impact Evaluation
Reporting Results and Utilization of Evaluations
Managing Monitoring and Evaluation Functions