Evaluation Revisited – Improving the Quality of Evaluative Practice by Embracing Complexity

Utrecht Conference Report. Irene Guijt, Jan Brouwers, Cecile Kusters, Ester Prins and Bayaz Zeynalova. March 2011. Available as a PDF.

This report summarises the outline and outputs of the Conference ‘Evaluation Revisited: Improving the Quality of Evaluative Practice by Embracing Complexity’, which took place on May 20-21, 2010. It also records insights and observations related to the conference themes that emerged in subsequent presentations about the conference at specific events.

Contents (109 pages):

1 What is Contested and What is at Stake
1.1 Trends at Loggerheads
1.2 What is at Stake?
1.3 About the May Conference
1.4 About the Report
2 Four Concepts Central to the Conference
2.1 Rigour
2.2 Values
2.3 Standards
2.4 Complexity
3 Three Questions and Three Strategies for Change
3.1 What does ‘evaluative practice that embraces complexity’ mean in practice?
3.2 Trade-offs and their Consequences
3.3 (Re)legitimise Choice for Complexity
4 The Conference Process in a Nutshell

“The Truth Wears Off” & “More Thoughts on the Decline Effect”

These two articles by Jonah Lehrer in the New Yorker (13/12/2010 and 03/01/2011) provide a salutary reminder of the limits of experimental methods as a means of research when undertaken by mere mortals. They do not, in my reading, dismiss or invalidate the use of experimental methods, but they do highlight the need for a longer-term view of their efficacy, one which situates their use within a social context.

Many thanks to Irene Guijt for bringing these articles to my attention, and to others', via the Pelican email list.

In “The Truth Wears Off,” Lehrer “wanted to explore the human side of the scientific enterprise. My focus was on a troubling phenomenon often referred to as the ‘decline effect,’ which is the tendency of many exciting scientific results to fade over time. This empirical hiccup afflicts fields from pharmacology to evolutionary biology to social psychology. There is no simple explanation for the decline effect, but the article explores several possibilities, from the publication biases of peer-reviewed journals to the ‘selective reporting’ of scientists who sift through data.”
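
One of the explanations Lehrer explores, publication bias, lends itself to a simple illustration. The sketch below (in Python; the effect size, noise level, and publication threshold are invented purely for illustration) simulates what happens when only strikingly large initial results get published: the published estimates are inflated by selection on noise, so unbiased replications of the same true effect will, on average, come in lower, producing an apparent "decline" even though nothing in nature has changed.

```python
import random
import statistics

random.seed(42)

TRUE_EFFECT = 0.2        # assumed real underlying effect size
NOISE_SD = 0.5           # sampling noise in each study's estimate
PUBLISH_THRESHOLD = 0.8  # only unusually large initial estimates get published

initial_published = []
replications = []

for _ in range(100_000):
    # An initial study estimates the true effect with noise.
    first_estimate = random.gauss(TRUE_EFFECT, NOISE_SD)
    # Publication bias: only striking results make it into print.
    if first_estimate > PUBLISH_THRESHOLD:
        initial_published.append(first_estimate)
        # An independent replication of the same true effect, with no selection applied.
        replications.append(random.gauss(TRUE_EFFECT, NOISE_SD))

print(f"Mean published initial estimate: {statistics.mean(initial_published):.2f}")
print(f"Mean replication estimate:       {statistics.mean(replications):.2f}")
print(f"True effect:                     {TRUE_EFFECT:.2f}")
```

Running this, the published initial estimates average around 1.0 while the replications average close to the true 0.2: regression to the mean after selective publication is enough to make an effect appear to "wear off". This is, of course, only one of the several mechanisms Lehrer discusses.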

In “More Thoughts on the Decline Effect,” Lehrer responds to the various emails, tweets and comments posted in reply to his article. Amongst other things, he says: “I think the decline effect is an important reminder that we shouldn’t simply reassure ourselves with platitudes about the rigors of replication or the inevitable corrections of peer review. Although we often pretend that experiments settle the truth for us—that we are mere passive observers, dutifully recording the facts—the reality of science is a lot messier. It is an intensely human process, shaped by all of our usual talents, tendencies, and flaws.”

There are also many interesting comments following both of Lehrer’s posts.
