Conference: Next Generation Evaluation: Embracing Complexity, Connectivity and Change

Posted on Friday, March 21st, 2014

“On Nov. 14th 2013, FSG and Stanford Social Innovation Review convened Next Generation Evaluation: Embracing Complexity, Connectivity and Change to discuss emerging ideas that are defining the future of social sector evaluation. The Conference brought together nearly 400 participants to learn about the trends driving the need for evaluation to evolve, the characteristics and approaches that represent Next Generation Evaluation, and potential implications for the social sector.”

The conference website provides video recordings of eight presentations and PDF copies of many more.

Introducing Next Generation Evaluation, Hallie Preskill, Managing Director, FSG

Developmental Evaluation: An Approach to Evaluating Complex Social Change Initiatives, Kathy Brennan, Research and Evaluation Advisor, AARP

Shared Measurement: A Catalyst to Drive Collective Learning and Action, Patricia Bowie, Consultant, Magnolia Place Community Initiative

Using Data for Good: The Potential and Peril of Big Data, Lucy Bernholz, Visiting Scholar, Center on Philanthropy and Civil Society, Stanford University

Frontiers of Innovation: A Case Study in Using Developmental Evaluation to Improve Outcomes for Vulnerable Children, James Radner, Assistant Professor, University of Toronto

Project SAM: A Case Study in Shared Performance Measurement for Community Impact, Sally Clifford, Program Director, Experience Matters, and Tony Banegas, Philanthropy Advisor, Arizona Community Foundation

UN Global Pulse: A Case Study in Leveraging Big Data for Global Development, Robert Kirkpatrick, Director, UN Global Pulse

Panel: Implications for the Social Sector (“So What?”), Presenters: Lisbeth Schorr, Senior Fellow, Center for the Study of Social Policy; Fay Twersky, Director, Effective Philanthropy Group, The William and Flora Hewlett Foundation; Alicia Grunow, Senior Managing Partner, Design, Development, and Improvement Research, Carnegie Foundation for the Advancement of Teaching. Moderator: Srik Gopalakrishnan, Director, FSG

Small Group Discussion: Implications for Individuals and Organizations (“Now What?”), Moderator: Eva Nico, Director, FSG

Embracing Complexity, Connectivity, and Change, Brenda Zimmerman, Professor, York University


DCED Global Seminar on Results Measurement 24-26 March 2014, Bangkok

Posted on Monday, January 6th, 2014

Full text available here:

“Following popular demand, the DCED is organising the second Global Seminar on results measurement in the field of private sector development (PSD), 24-26 March 2014 in Bangkok, Thailand. The Seminar is being organised in cooperation with the ILO and with financial support from the Swiss State Secretariat for Economic Affairs (SECO). It will have a similar format to the DCED Global Seminar in 2012, which was attended by 100 participants from 54 different organisations, field programmes and governments.

Since 2012, programmes and agencies have been adopting the DCED Standard for results measurement in increasing numbers; recently, several have published the reports of their DCED audit. This Seminar will explore what is currently known, and what we need to know; specifically, the 2014 Seminar is likely to be structured as follows:

  • An introduction to the DCED, its Results Measurement Working Group, the DCED Standard for results measurement and the Standard audit system
  • Insights from 10 programmes experienced with the Standard, based in Bangladesh, Cambodia, Fiji, Georgia, Kenya, Nepal, Nigeria and elsewhere (further details to come)
  • Perspectives from development agencies on results measurement
  • Cross-cutting issues, such as the interface between the Standard and evaluation, measuring systemic change, and using results in decision-making
  • A review of the next steps in learning, guidance and experience around the Standard
  • Further opportunities for participants to meet each other, learn about each other’s programmes and make contacts for later follow-up

You are invited to join the Seminar as a participant. Download the registration form here and return it to the organisers. There is a fee of $600 for those accepted for participation, and all participants must pay their own travel, accommodation and insurance costs. Early registration is advised.”


INTRAC M&E Workshop: Practical Responses to Current Monitoring and Evaluation Debates

Posted on Monday, August 5th, 2013

A one-day workshop for M&E practitioners, civil society organisations and development agencies to debate and share their experiences. There is increased pressure on NGOs to improve their M&E systems, and often to move out of their methodological comfort zone to meet new requirements from donors and stakeholders. This event will examine the challenges faced by NGOs and their responses around four themes:

  • Designing and using baselines for complex programmes
  • Using Information and Communications Technology (ICT) in M&E
  • Experimental and quasi-experimental methods in M&E, including randomised control trials
  • M&E of advocacy

Download the overview paper.

Call for M&E case studies

We are looking for short case studies focusing on one or more of the four event themes (see above). The case studies will be shared and will form the basis of the discussions at the workshop.

*Deadline for abstracts (max. 500 words): Friday 13 September 2013*

Please email abstracts to

Event bookings

Event cost: £80 (£60 early bird booking before 19 October 2013)

Please return the booking form to



ICAI Seeks Views on Revised Evaluation Framework

Posted on Wednesday, May 1st, 2013


“In our first report, ICAI’s Approach to Effectiveness and Value for Money, we set out an evaluation framework, consisting of 22 questions under 4 guiding criteria (objectives, delivery, impact and learning), to guide our lines of enquiry in reviews. In the light of our experience to date in carrying out our reports, we have reviewed this framework. The revised framework is available at this link: ICAI revised evaluation framework

We are now entering a period of consultation on the revised framework, which will run until 24 May 2013. If you have any comments or views, please email them, or post them to: The Secretariat, Independent Commission for Aid Impact, Dover House, 66 Whitehall, London SW1A 2AU”


Open Consultation: Triennial Review of the Independent Commission for Aid Impact (ICAI)

Posted on Tuesday, April 16th, 2013

(Website that hosts the text below)

This consultation closes on 26 April 2013

On 21 March the Government announced the triennial review of the Independent Commission for Aid Impact (ICAI) and is seeking views of stakeholders who wish to contribute to the Review. Triennial Reviews of Non-Departmental Public Bodies (NDPBs) are part of the Government’s commitment to review all NDPBs, with the aim of increasing accountability for actions carried out on behalf of the State.

The ICAI’s strategic aim is to provide independent scrutiny of UK aid spending, to promote the delivery of value for money for British taxpayers and to maximise the impact of aid.

The Review will be conducted in line with Cabinet Office principles and guidance, in two stages.

The first stage will:

  • Identify and examine the key functions of the ICAI and assess how these functions contribute to the core business of DFID;
  • Assess the requirement for these functions to continue given other scrutiny processes;
  • If continuing, assess how the key functions might best be delivered; if one of these options is continuing delivery through the ICAI, then make an assessment against the Government’s “three tests”: technical function; political impartiality; and the need for independence from Ministers.

If the outcome of stage one is that delivery should continue through the ICAI, the second stage of the review will:

  • Review whether ICAI is operating in line with the recognised principles of good corporate governance, using the Cabinet Office “comply or explain” standard approach.

In support of these aims we would welcome input and evidence from stakeholders, focused on these main questions:

ICAI’s functions

For the purposes of this review, we have defined ICAI’s key functions as follows:

  • Produce a wide range of independent, high quality/professionally credible and accessible reports (including evaluations, VfM reviews, investigations) setting out evidence of the impact and value for money of UK development efforts;
  • Work with and for Parliament to help hold the UK Government to account for its development programme, and make information on this programme available to the public;
  • Produce appropriately targeted recommendations to be implemented/followed up by HMG.

Which of these functions do you think are still needed? What would be the impact if ICAI ceased to exist?

Would you define ICAI’s functions differently?

Do you think any of the following delivery mechanisms would be more appropriate or cost effective at delivering these functions: Local government, the voluntary sector, private sector, another existing body or DFID itself?

To date, do you think ICAI has focused on scrutinising UK aid spend or the wider HMG development effort? What do you think it should be doing?

Where do you think ICAI sits on the spectrum between audit and research? Is this where it should be?

How far can and should ICAI have a role in holding HMG to account?

Production of reports

What is the quality of ICAI reports? Is the expertise of those producing the reports appropriate? How does this compare to other scrutiny bodies that you know of?

How far does the methodology used by ICAI add value to other scrutiny of DFID programmes (eg IDC, NAO, DFID internal)?

How far does ICAI involve beneficiaries in its work?

What impact have ICAI reviews had on DFID staff and resources?

How independent do you believe ICAI is? How important do you think this independence is for ICAI’s ability to deliver its functions effectively?

How much of an impact do you think the Commissioners have on ICAI recommendations and reports? What added value do you think they bring? Do they have the right skillset?

Making information available to the IDC and the public

How important do you think ICAI’s role is in providing information about UK development to taxpayers?

What impact has ICAI had on public perceptions of UK development?

Production of targeted recommendations

What has been the added value of ICAI’s recommendations? How do these compare to other scrutiny bodies that you know of?

How far and why have recommendations been followed up?

What impact has ICAI had on DFID’s own approach to monitoring impact and value for money?

How far has ICAI promoted lesson learning in DFID?


Do you think ICAI could improve? If so, how do you think ICAI could improve?

Do you have any other comments?

The government is seeking views of stakeholders on the Triennial Review of the Independent Commission for Aid Impact (ICAI).

Contact us by 26 April 2013

Write to us:

ICAI Review Team KB 2.2
22 Whitehall

“Evaluation and Inequality: Moving Beyond the Discussion of Poverty” International Development Evaluation Association Global Assembly

Posted on Monday, November 19th, 2012

Bridgetown, Barbados (May 6-9, 2013)

IDEAS website


The Board of the International Development Evaluation Association (IDEAS) is pleased to announce its next Global Assembly on May 7-9, 2013 in Bridgetown, Barbados, preceded by professional development training sessions on May 6. The theme of the Assembly will be the relationship between evaluation and inequality, and their influence on development. This theme underscores the role that evaluative knowledge can play in development in general, and more particularly in addressing the sustaining factors that generate and perpetuate poverty.

Assembly Agenda and Call for Paper/Panel Proposals:

The Assembly will be organized into a number of substantive strands, listed below. Potential presenters are invited to make a proposal for a paper or panel in one or more of these strands. General paper proposals on evaluation topics outside the theme of the strands are also invited. We especially invite papers that are grounded in development experience.

Strand One: Understanding Inequality and its relation to the causes and consequences of poverty

Strand Two: Effective program strategies to address inequality—findings from evaluation

Strand Three: Regional responses/regional strategies to address inequality

Strand Four: The measurement and assessment of inequality

Strand Five: General Paper Sessions—all other papers/panels being proposed on any evaluation topic

All paper/panel proposals should be sent by January 10, 2013 to: Ray C. Rist, President of IDEAS, at the following e-mail address:

Proposal Guidelines:

1) Each paper or panel proposal can be no more than 250 words in total. The proposal should include the title, name(s) of participants, affiliation of participants, and a brief description of the subject of the paper/panel.

2) The deadline for submission of all proposals is January 10, 2013.

3) Consideration of any proposal after January 10 is at the full discretion of the chair.

4) Decisions on all proposals will be made within two weeks and presenters will be informed immediately.


Scholarships: A small number of scholarships will be available to ensure global representation of development evaluators at this Assembly. First priority for scholarships will be given to current IDEAS members who present a paper/panel or are actively involved in the Assembly as a panel chair or discussant.

NOTE: Anyone who wishes to present at this Assembly must be a current member of IDEAS.


Free, relevant, well-organised online courses: Statistics, Model Thinking and others

Posted on Wednesday, October 24th, 2012

Provided FREE by Coursera in cooperation with Princeton, Stanford and other Universities

Each course’s opening page gives this information: About the Course, About the Instructor, The Course Syllabus, Introductory Video, Recommended Background, Suggested Readings, Course Format, and FAQs.

Example class format: “Each week of class consists of multiple 8-15 minute long lecture videos, integrated weekly quizzes, readings, an optional assignment and a discussion. Most weeks will also have a peer reviewed assignment, and there will be the opportunity to participate in a community wiki-project. There will be a comprehensive exam at the end of the course.”

The contents of past courses remain accessible.

RD Comment: Highly recommended! [I am doing the stats course this week]


Evaluating the impact of aid to Africa: lessons from the Millennium Villages

Posted on Wednesday, June 13th, 2012

3 July 2012 17:00-18:30 (GMT+01 (BST)) – Public event, Overseas Development Institute and screened live online

Register to attend

“At the turn of the century, Jeffrey Sachs of Columbia University, in partnership with the United Nations, established integrated rural development projects, known as Millennium Villages in ten African countries. When they came to be evaluated in 2011, an intense row broke out between development experts about their impact and sustainability.

ODI and the Royal Africa Society are delighted to host Michael Clemens, who will argue that aid projects in Africa need much more careful impact evaluations that are transparent, rigorous, and cost-effective. Our panel of experts will also discuss the Millennium Villages project within the wider context of international aid to Africa, analysing other development models and questioning the impact of each one.”


AEA Conference: Evaluation in Complex Ecologies

Posted on Wednesday, February 22nd, 2012

Relationships, Responsibilities, Relevance
26th Annual Conference of the American Evaluation Association
Minneapolis, Minnesota, USA
Conference: October 24-27, 2012
Workshops: October 22, 23, 24, 28

“Evaluation takes place in complex global and local ecologies where we evaluators play important roles in building better organizations and communities and in creating opportunities for a better world. Ecology is about how systems work, engage, intersect, transform, and interrelate. Complex ecologies are comprised of relationships, responsibilities, and relevance within our study of programs, policies, projects, and other areas in which we carry out evaluations.

Relationships. Concern for relationships obliges evaluators to consider questions such as: what key interactions, variables, or stakeholders do we need to attend to (or not) in an evaluation? Evaluations do not exist in a vacuum disconnected from issues, tensions, and historic and contextualized realities, systems, and power dynamics. Evaluators who are aware of the complex ecologies in which we work attend to relationships to identify new questions and to pursue new answers. Other questions we may pursue include:

  • Whose interests and what decisions and relationships are driving the evaluation context?
  • How can evaluators attend to important interactions amidst competing interests and values through innovative methodologies, procedures, and processes?

Responsibilities. Attention to responsibilities requires evaluators to consider questions such as: what responsibilities, inclusive of and beyond the technical, do we evaluators have in carrying out our evaluations? Evaluators do not ignore the diversity of general and public interests and values in evaluation. Evaluations in complex ecologies bring to the fore ethical and professional obligations and understandings between parties, who seek to frame questions and insights that challenge them. Other questions we may pursue include:

  • How can evaluators ensure their work is responsive, responsible, ethical, equitable, and/or transparent for stakeholders and key users of evaluations?
  • In what ways might evaluation design, implementation, and utilization be responsible to issues pertinent to our general and social welfare?

Relevance. A focus on relevance leads to evaluations that consider questions such as: what relevance do our evaluations have in complex social, environmental, fiscal, institutional, and/or programmatic ecologies? Evaluators do not have the luxury of ignoring use, meaning, or sustainability; instead, all evaluations require continual review of purposes, evaluands, outcomes, and other matters relevant to products, projects, programs, and policies. Other questions we may pursue include:

  • How can evaluators ensure that their decisions, findings, and insights are meaningful to diverse communities, contexts, and cultures?
  • What strategies exist for evaluators, especially considering our transdisciplinary backgrounds, to convey relevant evaluation processes, practices, and procedures?

Consider this an invitation to submit a proposal for Evaluation 2012 and join us in Minneapolis as we consider evaluation in complex ecologies where relationships, responsibilities, and/or relevance are key issues to address.”



Conference: Measuring Impact of Higher Education for Development

Posted on Thursday, February 16th, 2012

From: Monday 19th March 2012 to Tuesday 20th March 2012

Venue:  Birkbeck College, Malet Street, London

Organisers: London International Development Centre (LIDC); Association of Commonwealth Universities (ACU)

Background: Higher education for international development has been, in recent years, a neglected area relative to other educational interventions. Yet higher education (HE) is necessary for the attainment of Millennium Development Goals (MDGs) and for economic development in low and middle income countries.

There is a long history of development assistance interventions in HE to support development goals, directed at strengthening individual, organisational and institutional capacity. These have included scholarship programmes as well as support to specific universities and university networks in low and middle income countries, and support to academic research and training partnerships.

However, there has been little comparison of these different interventions in terms of their international development impact. This limits our understanding of “what works” in HE interventions for development, and our ability to invest effectively in future.

The aim of this two-day international conference is to examine the current status of impact evaluation for HE interventions and to identify research gaps and needs for the future. The conference will focus on three issues:
  • What has been, and should be, the development intention of HE interventions?
  • How should development impacts be measured?
  • What is our experience with measurement methods and tools to date, where are the gaps and what research priorities emerge?

The programme will be posted online soon.

Who should attend:

The conference will bring together experts from three research sectors: higher education, international development and impact evaluation from academia, think tanks, government agencies and civil society organisations. PhD students are welcome if their research is relevant to the theme of the conference.

Registration is open between 2 February and 5 March 2012.
To register, please fill in and return the registration form.
Attendance is free of charge.
