Training: Evaluating the quality of humanitarian projects

Training course – ‘Evaluating the Quality of Humanitarian Projects’, postponed to 19-23 May 2008, Plaisians, France
This 5-day course will take place in the training centre at Groupe URD’s headquarters. The course looks at evaluation (definition, phases, types…) with regard to the Quality of humanitarian action.
It is organised around a full case study based on real experiences in the field.
The course is aimed at programme managers, project managers, M&E managers, evaluators and anyone else in the humanitarian sector involved in project management.
Language: French.
Registration form
Contact

UKES Annual Conference 2008, October 2008

Date: 23-24 October 2008
Venue: The Bristol Marriott Royal Hotel

Changing Contexts for Evaluation and Professional Action


Call for abstracts

The UKES Annual Conference 2008 will be held on 23-24 October at the Bristol Marriott Royal Hotel, preceded on 22 October by a programme of Training and Professional Development Workshops.

The closing date for the call for abstracts is 23 May 2008.

Conference website

The European Evaluation Society Conference, Lisbon 2008

Date: 1-3 October 2008
Venue: Lisbon, Portugal

‘Building for the future: Evaluation in governance, development and progress’

Programme

The conference will comprise keynote speakers, plenary and parallel sessions, networking opportunities and social events.

The main programme will commence at 19.00 on Wednesday 1 October and close with a reception at 18.00 on Friday 3 October.

The conference will be preceded by a programme of Pre-Conference Training and Professional Development Workshops, commencing on Tuesday morning, 30 September and closing at 16.30 on Wednesday 1 October (4 half-day sessions). Read more about how you can contribute to these sessions.

The programme will include the following keynote presentations:

Development Evaluation at a Crossroads?
Niels Dabelstein, Evaluation of the Paris Declaration, Danish Institute for International Studies, Denmark

Evaluation and policy implementation in multi-level governance
Nicoletta Stame, Professor of Social Policy, University of Roma “La Sapienza”, Italy

Evaluating drug policies in the European Union: current situation and future directions
Dra. Maria Moreira, European Monitoring Centre for Drugs and Drug Addiction, Portugal

Making evaluation more useful for determining development effectiveness
Dr Vinod Thomas, Director-General, Independent Evaluation Group (IEG), World Bank Group, Washington DC, USA

Parallel Sessions

We received over 350 abstracts from participants from 49 countries world-wide. These submissions will be presented in themed parallel sessions of papers, symposia, roundtables, panel/debates, posters and other innovative formats. A full programme will be available well in advance of the conference from this website.

Parallel sessions will be grouped under the following strands:
1. Methodological choices and approaches
2. Ethical practice, evaluation standards and professional development
3. Evaluation and its role in policy implementation and programme intervention
4. Evaluation and building informal social capital
5. International evaluation and evaluation in developing countries
6. Evaluation and organisational development
7. Encouraging evaluation use

The May 2008 Edinburgh Evaluation Summer School

Date: May 19th – 21st and May 26th – 28th (two sessions)
Venue: University of Edinburgh, Scotland

(from Eval-Sys email list)

Dear Colleagues

The Edinburgh Evaluation Summer School
(http://www.chs.med.ed.ac.uk/ruhbc/evaluation/summerschool2008) is now in
its 3rd year; previous summer schools have attracted interest from
evaluators and planners across all disciplines. As in prior years, the
Summer School faculty will include world-class evaluation scholars and
practitioners. Beyond the opportunity to learn from these scholars and
practitioners, the Summer School also fosters a community of learning and
gives participants a platform to raise questions and discuss topics
relevant to evaluation.

The Edinburgh Evaluation Summer School will this year take place on the
following dates:

May 19th – 21st: Evaluations and Inequalities
May 26th – 28th: Getting Real About Impact Evaluation

Further details on each class and how to secure your place on this
year’s courses can be found on the Summer School website.
http://www.chs.med.ed.ac.uk/ruhbc/evaluation/summerschool2008/index.html

This website will be updated regularly so please check periodically.
Don’t hesitate to reach us at RUHBC.SummerSchool@ed.ac.uk if
you have any questions.

Best wishes,

Sanjeev Sridharan

Head Evaluation Programme
RUHBC
University of Edinburgh
Teviot Place
Edinburgh, EH8 9AG
Scotland, UK

Next Outcome Mapping training workshop in Europe

Date: April 22-24, 2008
Venue: Ede, The Netherlands

(from the OM email list)

Colleagues,
This is to let you know that MDF Training & Consulting will be hosting my next Outcome Mapping training workshop in Europe at its headquarters in Ede, The Netherlands, April 22-24, 2008. You can see the full description of this introductory course at: http://www.mdf.nl/index.php/page/85/outcome-mapping?mod[MDFCourseCalendarModule][item]=95

If you have colleagues or partners who may be interested, they can register at:
http://www.mdf.nl/index.php/page/85

Cheers,
Terry

Terry Smutylo
542 Fraser Ave.
Ottawa, Ontario
Canada, K2A 2R4

Tel: (613) 729-6844
Fax: (613) 288-8993
tsmutylo@magma.ca
tsmutylo@gmail.com

Monday Developments issue on NGO accountability

(via Niels Keijzer on the Pelikan email list)

The December 2007 issue of Monday Developments, a monthly magazine
published by InterAction (the largest coalition of NGOs in the United
States), explores key accountability issues for NGOs. Through various
angles, the issue looks into “(…) the conflicts organizations face with
scarce resources, demanding missions and the need to evaluate progress and
effectiveness.”

The articles include views on the topic from development donors and the
Humanitarian Accountability Project, and cover the importance of listening
for accountability, implications for evaluation standards and practice,
downward accountability, and more.

You can download the magazine here:
http://www.interaction.org/files.cgi/6117_MDDec2007.pdf

Evaluation of Results Based Management at UNDP

(via Niels Keijzer on the Pelikan email list)
The United Nations Development Programme’s independent evaluation office
has recently examined the agency’s adoption of results based management
(RBM). I would like to refer to this evaluation in relation to our recent
exchanges around changing information needs, and the ‘missing middle’ due
to the emphasis on collecting information on outcomes and impact. The main
purpose of this evaluation was to examine the degree to which RBM has
fostered a results culture within the organization, enhanced capacity to
improve management decisions, and strengthened UNDP’s contribution to
development results.

The UNDP website summarises the evaluation results as follows:
“The evaluation concludes that UNDP is largely managing for outputs rather
than outcomes and that the linkages between outputs and intended outcomes
are not clearly articulated. The introduction of corporate systems and
tools, which have had some efficiency benefits, have not however,
strengthened the culture of results in the organization or improved
programmatic focus at the country level. The current approach of defining
and reporting against centrally defined outcomes tends to undermine UNDP’s
responsiveness and alignment to nationally defined outcomes and
priorities. The evaluation makes a number of recommendations to address
these and other challenges.”

Here is a link to a page on the UNDP website, where you can download the
report, the individual chapters, and the management response:
http://www.undp.org/eo/thematic/rbm.html

Regulatory Impact Analysis (RIA) Training Course

Date: 6-10 October 2008
Venue: College of Europe, Bruges Campus, Belgium

Dear Colleague,

The College of Europe and Jacobs and Associates Europe invite you to participate in our 5-day Regulatory Impact Analysis (RIA) Training Course on the principles, procedures, and methods of RIA. This practical, hands-on course was first given in March and, due to demand, will be offered twice more in 2008, in June and October. The course, led by the most experienced public policy and RIA trainers in Europe, is expressly designed for policy officials and executives who use RIA to improve policy results.

The course will benefit any official using RIA in environmental, social and economic fields, as well as stakeholders such as business associations, NGOs and consultants who want to better understand how to use RIA constructively. The course is open for subscription worldwide and is presented in the historic city of Bruges, Belgium. A discount is offered for early registration.

Information on RIA Training Course

2008 DATES: 23-27 June and 6-10 October (each course is 5 full days)
LOCATION: College of Europe, Bruges Campus, Belgium
REGISTRATION: For more information and the application form, go to www.coleurope.eu/ria2008
COST:

  • €2,995 for early registration (includes housing and meals)
  • €3,495 for regular registration (includes housing and meals)

REGISTRATION DEADLINES:

Early registration for the June course runs until 11 May 2008.
Registration closes 1 June 2008.

Early registration for the October course runs until 10 August 2008.
Registration closes on 14 September 2008.

OPEN: Worldwide (only 40 seats available per session)
LANGUAGE OF INSTRUCTION: English
COURSE OFFERED BY: College of Europe and Jacobs and Associates Europe

The College of Europe provides a wide range of professional training courses, workshops and tailor-made seminars on the European Union in general or on targeted issues. For more information, please visit:
www.coleurope.eu/training or contact Mrs. Annelies Deckmyn by email: adeckmyn@coleurop.be

Jacobs and Associates continues to offer its tailored RIA training courses on-site around the world, adapted to the client’s needs. To discuss an on-site RIA course, contact ria@regulatoryreform.com. For information on the full range of regulatory reform work by Jacobs and Associates, see http://www.regulatoryreform.com/.

Best wishes,
Scott Jacobs
Managing Director, Jacobs and Associates Europe

The Third High Level Forum on Aid Effectiveness (HLF 3)

Date: 2-4 September 2008
Venue: Accra, Ghana

The Third High Level Forum on Aid Effectiveness (HLF 3) will be hosted in Accra by the Government of Ghana on 2-4 September 2008. The HLF 3 builds on several previous high-level international meetings, most notably the 2003 Rome HLF, which highlighted the issue of harmonisation and alignment, and the 2005 Paris HLF, which culminated in the endorsement of the Paris Declaration on Aid Effectiveness by over 100 signatories from partner governments, bilateral and multilateral donor agencies, regional development banks, and international agencies.

The primary intention of the HLF 3 is to take stock of and review the progress made in implementing the Paris Declaration, and to broaden and deepen the dialogue on aid effectiveness by giving ample space and voice to partner countries and newer actors (such as civil society organisations and emerging donors). It is also a forward-looking event which will identify the action needed and the bottlenecks to overcome in order to improve aid effectiveness for 2010 and beyond.

The HLF 3 will be organised as a three-tier structure:

  • The Marketplace, which will provide an opportunity for a wide range of actors to showcase good and innovative practices and lessons from promoting aid effectiveness;
  • Roundtable meetings, which will provide an opportunity for in-depth discussion of selected key issues to facilitate and support decision-taking and policy endorsement on aid effectiveness; and
  • The Ministerial-Level Meeting, which is expected to conclude the HLF 3 with the endorsement of a ministerial statement based on high-level discussions and negotiation around key issues.

Related items:

DFID’s Independent Advisory Committee on Development Impact (IACDI)

Minutes of the new Independent Advisory Committee on Development Impact, 8 March 2008 (contactable via ev-dept@dfid.gov.uk; the committee will have its own website and contact email address by mid-2008)

On independence of evaluation at DFID

(excerpt) … Comments made by committee members in the ensuing discussion were
as follows:

  • There was a case for DFID’s Evaluation Department (EvD) taking on responsibility for oversight, quality assurance and guidance on self-evaluations in DFID. At present this function seemed fragmented and largely left to the judgement of line managers. This would require extra resources for EvD. In any event, self-evaluation in the department needed to be greatly strengthened.
  • Independent evaluation was dependent upon good quality information on the ground, and this also would be helped with a stronger culture of monitoring and self-evaluation throughout the department.
  • The committee needed to appreciate the realities of a government department in considering independence issues, though that did not mean that IACDI could not play a very significant role in protecting and enhancing evaluation independence in DFID.
  • An increase in DFID budgets meant an increasing need for information on effectiveness. There seemed a contradiction between the need for more, in-depth evaluation and a declining administrative budget for evaluations. In any event protecting independence suggested a need to explore ways to protect the budget for evaluation.
  • There were concerns about the current reporting arrangements for the Head of EvD which do not conform to internationally recognised criteria. A number of options existed including a direct reporting line to the PS or a DG or the creation of a new post of DG for Audit and Evaluation to which the head of EvD would report. Independence could be further buttressed by further developing the relationship between the Head of EvD and IACDI (e.g. with respect to employment, removal and performance assessment of the EvD Head) and displaying this reporting relationship as a ‘dotted line’ on the organization chart.
  • There were also concerns around the status and grade of the Head of EvD, given the need for the post to have greater visibility and carry greater clout.
  • Some felt that, ideally, a head of evaluation should have a contract precluding employment elsewhere in DFID, and that an advantage of upgrading the post was that it could attract good candidates towards the end of their careers, for whom this would not be an issue. Any contract of this kind should be for a fixed term (either of 10 years or more, or renewable on the advice of IACDI). Not all, however, took the view that future employment within DFID should be precluded, and all stressed that the rights of the current incumbent should be protected.
  • Whatever option was chosen, it was felt that written job descriptions, protocols and arrangements for performance review, with a role for IACDI or its chair, could also usefully buttress the independence of the Head of EvD.
  • There was a need to explore further the modalities for (and the control of the Head of EvD over) staffing: over time EvD may need to change the balance towards an increased role for EvD staff and a lesser role for external consultants.
  • There was also a need for clear written protocols for unimpeded access to information in DFID; for rules of engagement with DFID staff in discussing draft reports; for avoiding staff conflicts of interest; and a written policy on disclosure of reports.
  • The need for a clear and agreed departmental policy on evaluation, to meet internationally recognised criteria and be reviewed by IACDI, was highlighted by committee members, recognising that work on this was already underway.

On Country Programme Evaluations

(excerpt) … Views expressed by the committee were as follows:

  • It was recognised that different types of CPEs are required in different contexts (e.g., fragile states, smaller country programmes).
  • CPEs should aim to provide reliable evidence of impact, particularly on poverty reduction. It was pointed out, however, that it would take a much greater effort and better monitoring and baseline data to get at impact.
  • Some questioned the value of CPEs in relation to the amount spent on them. Others recognised that they held country Directors to account and that they were valued by DFID senior management. The annual CPE synthesis by EvD generated useful lessons and identified themes.
  • Concerns were raised about the apparently low priority given to CPEs by DFID country teams. It was important that they recognised the importance of evaluation and that data collection and monitoring arrangements should be built in from the outset of programmes.
  • Over time there would be advantage in country teams being made responsible for most CPEs as a regular component of country policy management so long as there were adequate incentives set by senior management and adequate quality assurance and oversight, probably best provided by EvD. A transitional arrangement might entail an approach which includes part funding of CPEs by country offices.
  • EvD might then consider complementing this self-evaluation effort by carrying out more in-depth independent evaluations in a few countries, and continuing to produce annual syntheses of all CPEs carried out in DFID, with the possibility of including the results of similar work carried out by other donors.
  • Such changes would probably take time, however, suggesting that at least for the next year, EvD should be prepared to proceed with its currently planned programme of CPEs.
  • DFID should explore amending ToRs and reporting arrangements for CPEs to make it clear that partner governments and in some cases other donors will also benefit from the evaluations.
  • EvD should in any event carry out a methodology review for CPEs.

On the Evaluation Work Programme

(excerpt) … The views of committee members were as follows:

  • They were concerned at the declining administration budget for evaluation at a time when the overall programme spend was rising fast and use of the programme budget for evaluation was constrained.
  • A surprisingly large sum of programme funds was being earmarked for support of impact evaluations including capacity building and international systems. This was questioned by some. Others were supportive of the approach recognising that the work entailed partner governments leading the processes.
  • It was noted that nothing was being done by EvD on project evaluation though the Committee acknowledged that other parts of DFID were continuing to commission and undertake these. In future, it would be of interest to use the flexible funding line to find out what was going on at the project level. It would also be useful to evaluate smaller items with potentially greater impact.