Independent Commission for Aid Impact publishes report on “How DFID Learns”

Terms of Reference for the review

The review itself, available here, was published on 4th April 2014

Selected quotes:

“Overall Assessment: Amber-Red: DFID has allocated at least £1.2 billion for research, evaluation and personnel development (2011-15). It generates considerable volumes of information, much of which, such as funded research, is publicly available. DFID itself is less good at using it and building on experience so as to turn learning into action. DFID does not clearly identify how its investment in learning links to its performance and delivering better impact. DFID has the potential to be excellent at organisational learning if its best practices become common. DFID staff learn well as individuals. They are highly motivated and DFID provides opportunities and resources for them to learn. DFID is not yet, however, managing all the elements that contribute to how it learns as a single, integrated system. DFID does not review the costs, benefits and impact of learning. Insufficient priority is placed on learning during implementation. The emphasis on results can lead to a bias to the positive. Learning from both success and failure should be systematically encouraged”.

RD Comment: The measurement of organisational learning is no easy matter, so many people would likely be interested to know more about the ICAI approach. The ICAI report does define learning, as follows:

“We define learning as the extent to which DFID gains and uses knowledge to influence its policy, strategy, plans and actions. This includes knowledge from both its own work and that of others. Our report makes a distinction between the knowledge DFID collects and how it is actively applied, which we term as ‘know-how’.”

Okay, and how is this assessed in practice? The key word in this definition is “influence”. Influencing is a notoriously difficult process and outcome to measure. Unfortunately the ICAI report does not provide an explanation of how influence was assessed or measured. Annex 5 does show how the topic of learning was broken down into four areas: making programme choices; creating theories of change; choosing delivery mechanisms; and adapting and improving implementation of its activities. The report also provides some information on the sources used: “The 31 ICAI reports considered by the team examined 140 DFID programmes across 40 countries/territories, including visits undertaken to 24 DFID country offices” … “We spoke to 92 individuals, of whom 87 were DFID staff from: 11 DFID fragile state country offices; 5 non-fragile small country offices; 16 HQ departments; and 13 advisory cadres.” But how influence was measured remains unclear. ICAI could do better at modelling good practice here, i.e. transparency of evaluation methods. Perhaps then DFID could learn from ICAI about how to assess its (DFID’s) own learning in the future. Maybe…

Other quotes

“DFID is always losing and gaining knowledge. Staff are continuously leaving and joining DFID (sometimes referred to as ‘churn’). Fragile states are particularly vulnerable to high staff turnover by UK-based staff. For instance, in Afghanistan, DFID informed us that staff turnover is at a rate of 50% per year. We are aware of one project in the Democratic Republic of Congo having had five managers in five years. DFID inform us that a staff appointment typically lasts slightly under three years.” A table that follows shows an overall rate of around 10% per year.
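As a rough back-of-envelope check (my own assumption, not a calculation the report makes): in a steady state, the average posting length is approximately the reciprocal of the annual turnover rate.

\[
\text{mean posting length} \;\approx\; \frac{1}{\text{annual turnover rate}},
\qquad \frac{1}{0.5\,/\text{year}} = 2 \text{ years}.
\]

On that reading, the Afghanistan figure implies postings of about two years, and appointments of slightly under three years correspond to a rotation rate nearer 33% per year, so the overall 10% figure presumably measures something narrower, e.g. departures from DFID rather than moves between posts.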

“DFID does not track or report on the overall impact of evaluations. The challenge of synthesising, disseminating and using knowledge from an increasing number of evaluation reports is considerable. DFID reports what evaluations are undertaken and it comments on their quality. The annual evaluation report also provides some summary findings. We would have expected DFID also to report the impact that evaluations have on what it does and what it achieves. Such reporting would cover actions taken in response to individual evaluations and their impact on DFID’s overall value for money and effectiveness.” It is the case that some agencies do systematically track what happens to the recommendations made in evaluation reports, as sketched below.
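A minimal sketch (hypothetical, not based on any particular agency’s actual system) of the kind of record such a recommendation-tracking register might keep:

```python
# Hypothetical sketch of a recommendation-tracking register: each evaluation
# recommendation gets a record linking it to a management response, the action
# actually taken, and its current status.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class RecommendationRecord:
    evaluation_id: str                    # which evaluation report it came from
    recommendation: str                   # the recommendation text
    management_response: str              # e.g. "accepted", "partially accepted", "rejected"
    action_taken: Optional[str] = None    # what was actually done in response
    status: str = "open"                  # "open", "in progress" or "closed"
    due_date: Optional[date] = None
    closed_date: Optional[date] = None

def open_items(register: list[RecommendationRecord]) -> list[RecommendationRecord]:
    """Return recommendations not yet closed out, for periodic follow-up reporting."""
    return [r for r in register if r.status != "closed"]
```

Even a register this simple would let an agency report, for each evaluation, which recommendations were accepted and what happened next, which is the kind of impact reporting ICAI says it expected from DFID.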

“DFID has, however, outsourced much of its knowledge production. Of the £1.5 billion for knowledge generation and learning, it has committed at least £1.2 billion to fund others outside DFID to produce knowledge it can use (specifically research, evaluation and PEAKS). Staff are now primarily consumers of knowledge products rather than producers of knowledge itself. We note that there are risks to this model; staff may not have the practical experience that allows them wisely to use this knowledge to make programming decisions.”

“We note that annual and project completion reviews are resources that are not fully supporting DFID’s learning. We are concerned that the lesson-learning section was removed from the standard format of these reports and is no longer required. Lessons from these reports are not being systematically collated and there is no central resource regularly quality assuring reviews.”

RD Comment: Paras 2.50 to 2.52 are entertaining. A UK Government model of how people learn is presented, DFID staff are interviewed about how they think they learn, and then differences between the model and what staff report are ascribed to staff’s lack of understanding: “This indicates that DFID staff do not consciously and sufficiently use the experience of their work for learning. It also indicates, within DFID, an over-identification of learning with formal training.” OR… maybe it indicates that the model was wrong and the staff were right???

This para might also raise a smile or two: “There is evidence that DFID staff are sometimes using evidence selectively. It appears this is often driven by managers requiring support for decisions. While such selective use of evidence is not the usual practice across the department, it appears to be occurring with sufficient regularity to be a concern. It is clearly unacceptable.” Golly…

Comments?
