IFAD’s independent evaluation ratings database

(found via IFAD posting on Xceval)

[from the IFAD website] “The Independent Office of Evaluation of IFAD (IOE) is making publicly available all the ratings on the performance of IFAD-supported operations evaluated since 2002. As such, IOE joins the few development organizations that currently make such data available to the public at large. The broader aim of disclosing such evaluation data is to further strengthen organizational accountability and transparency (in line with IFAD’s Disclosure and Evaluation Policies), as well as to enable others interested (including researchers and academics) to conduct their own analysis based on IOE data.

All evaluation ratings may be seen in the Excel database. At the moment, the database contains ratings of 170 projects evaluated by IOE. These ratings also provide the foundation for preparing IOE’s flagship report, the Annual Report on Results and Impact of IFAD operations (ARRI).

As in the past, IOE will continue to update the database annually by including ratings from new independent evaluations conducted each year based on the methodology captured in the IFAD Evaluation Manual. It might be useful to underline that IOE uses a six-point rating scale (where 6 is the highest score and 1 the lowest) to assess the performance of IFAD-funded operations across a series of internationally recognised evaluation criteria (e.g., relevance, effectiveness, efficiency, rural poverty impact, sustainability, gender, and others).

Moreover, in 2006, IOE’s project evaluation ratings criteria were harmonized with those of IFAD’s operations, to ensure greater consistency between independent and self-evaluation data (Agreement between PMD and IOE on the Harmonization of Self-Evaluation and Independent Evaluation Systems of IFAD). The Harmonization agreement was further enhanced in 2011, following the Peer Review of IFAD’s Office of Evaluation and Evaluation Function. The aforementioned agreements also make it possible to identify any ‘disconnect’ in the reporting of project performance by IOE and by IFAD management respectively.”
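For those who would like to do their own analysis of the ratings, a minimal sketch of one possible starting point is shown below. It assumes a locally downloaded copy of the Excel database and uses hypothetical column names (“Project”, “Criterion”, “Rating”) and a hypothetical filename; the actual layout of the IOE file may differ.

# Minimal sketch: summarising IOE's six-point ratings by evaluation criterion.
# The filename and the columns "Project", "Criterion" and "Rating"
# (1 = lowest, 6 = highest) are assumptions; the real IOE file may differ.
import pandas as pd

ratings = pd.read_excel("ioe_ratings.xlsx")  # hypothetical filename

# Keep only valid scores on the six-point scale.
ratings = ratings[ratings["Rating"].between(1, 6)]

# Average rating and number of projects rated, per criterion
# (e.g. relevance, effectiveness, efficiency, rural poverty impact).
summary = (
    ratings.groupby("Criterion")["Rating"]
    .agg(mean_rating="mean", projects_rated="count")
    .sort_values("mean_rating", ascending=False)
)
print(summary)

A summary of this kind is essentially what underlies aggregate reporting such as the ARRI, and it is also the simplest way to check how many of the 170 evaluated projects carry a rating for any given criterion.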

The Katine Challenge: How to analyse 540+ stories about a rural development project

The Katine Community Partnerships Project in Soroti District, Uganda, funded by the Guardian and Barclays and implemented by AMREF, is exceptional in some respects and all too common in others.

It is exceptional in the degree to which its progress has been very publicly monitored since it began in October 2007. Not only have all project documents been made publicly available via the dedicated Guardian Katine website, but resident and visiting journalists have posted more than 540 stories about the people, the place and the project. These stories provide an invaluable in-depth and dynamic picture of what has been happening in Katine, unparalleled by anything I have seen in any other development aid project.

On the flip side, the project is all too common in the kinds of design and implementation problems it has experienced, along with its fair share of unpredictable and highly influential external events, including dramatic turn-arounds in various government policies, plus the usual share of staffing and contracting problems.

The project has now completed its third year of operation and is heading into its fourth and final year, one more year than originally planned.

I have a major concern. It is during this final year that more knowledge about the project will be available than ever before, but at the same time its donors, and perhaps various staff within AMREF, will become increasingly interested in new events appearing over the horizon. For example, the Guardian will cease its intensive journalistic coverage of the project from this month, and attention is now focusing on its new international development website.

So, I would like to pose an important challenge to all the visitors to the Monitoring and Evaluation NEWS website, and the associated MandE NEWS email list:

How can the 540+ stories be put to good use? Is there some form of analysis of their contents that would help AMREF, the Guardian, Barclays, the people of Katine, and all of us learn more from the Katine project?

To help, I have uploaded an Excel file listing all the stories since December 2008, with working hypertext links. I will try to progressively extend this list back to the start of the project in late 2007. The list also includes copies of all the progress reports, review and planning documents that AMREF has given the Guardian to be uploaded onto their website.

If you have any questions or comments please post them below, as Comments to this posting, in the first instance.

What would be most useful at this stage are ideas about plans or strategies for analysing the data, followed by volunteers to actually implement one or more of those plans.
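To get the ball rolling, one very simple strategy is sketched below: a keyword count over the story titles, grouped by year, to show how coverage of different themes has shifted over the life of the project. The filename, the column names (“Date”, “Title”, “URL”) and the list of themes are all hypothetical; the uploaded Excel file may be laid out differently, and the themes should be chosen by people who know the project.

# Minimal sketch of one possible analysis: how often do a few project themes
# appear in story titles, year by year? Filename, column names and themes
# are assumptions for illustration only.
import pandas as pd

stories = pd.read_excel("katine_stories.xlsx")  # hypothetical filename
stories["year"] = pd.to_datetime(stories["Date"]).dt.year

themes = ["health", "education", "water", "livelihoods", "governance"]
for theme in themes:
    hits = stories[stories["Title"].str.contains(theme, case=False, na=False)]
    print(theme, hits.groupby("year").size().to_dict())

This is deliberately crude, counting titles rather than full story texts, but even a rough count like this could suggest where a closer qualitative reading of the stories would be most rewarding.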

PS: My understanding is that the data is by definition already in the public domain, and therefore anyone could make use of it. However, that use should be fair and not for profit. What we should be searching for here are lessons, or truths in some form, that could be seen as having wider applicability and that are based, as far as possible, on sound argument and good evidence.

Launch of online database of research accountability tools

Announcement: 7 September: launch of online database of research accountability tools

The One World Trust, with support from the IDRC, has created an interactive, online database of tools to help organisations conducting policy relevant research become more accountable.

Processes of innovation and research are fundamental to improvements in quality of life and to creating a better society. But to realise these benefits, the quality of research alone is not enough. Organisations engaged in policy-relevant research and innovation must continually take into account and balance the needs of a diverse set of stakeholders: from the intended research users, to their clients and donors, to the research community and the research participants. Responsiveness to all of these is crucial if such organisations are to be legitimate and effective. In this, accountable processes are as important as high quality research products.

The Trust has built the online accountability database to support researchers, campaigners and research managers in thinking through the way they use evidence to influence policy in an accountable way. The database takes into account that research organisations are increasingly diverse: they are no longer just universities, but also private companies, public institutes and non-profit think-tanks. No single framework can encompass this diversity.

Instead, the database provides an inventory of over two hundred tools, standards and processes within a broad, overarching accountability framework. With a dynamic interface and several search functions, it allows users to identify the aspects of accountability that interest them, and provides ideas for improving their accountability in that context. Each tool is supported by sources and further reading.

We also encourage engagement with and discussion of the database content by allowing users to comment on individual tools or to submit their own tools, processes and standards for inclusion.

The database is an output of a three-year project, titled “Accountability Principles for Research Organisations.” Working with partners across the globe, the project has generated an accountability framework which is sufficiently flexible to apply to many contexts and different organisations.

The database will be available online from 7 September.

For more information about the project please feel free to contact us at bwhitty@oneworldtrust.org. For the database, please visit www.oneworldtrust.org/apro
