What this conference is about
This conference seeks clarity on how to measure what matters when we are aiming for the SDGs yet seem to live in a ‘post-truth’ society.
“Partnership and collaboration across every sector and at every level is vital if we are to meet the 2030 Global Goals for Sustainable Development. We need to find ways to measure progress in ways that have meaning to individuals from local to global, and across every sector. The range of organizations and stakeholders present and the range of initiatives being developed show how the Goals can be used to develop a shared framework.” – Jessica Fries, Executive Chairman, A4S. In: Measure What Matters: a Framework for Action. Post-Event Summary. 2016
How do we get to the core of what matters to be evaluated? And for whom? How can we generate evidence that has meaning for society at large, not just for its key players?
But how do we, as evaluators, programme officers and policy makers, contribute to this when facts seem to have become less influential than emotions and personal beliefs? The Oxford Dictionaries define ‘post-truth’ as “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief”. We have seen what happened in the UK (Brexit) and the USA (Trump). Evaluators are not the only ones advocating evidence-based decision making: “An American government that ignores science to pursue ideological agendas endangers the world” (Common Dreams, 25 January 2017; https://tinyurl.com/hvsqtum).
This conference is about measuring what matters in a post-truth society.
Speakers: Claire Hutchings, Head of Programme Quality, Oxfam GB (https://tinyurl.com/h94ysza) and
Wendy Asbeek, Director of the Policy and Operations Evaluation Department (IOB), Ministry of Foreign Affairs
For more information go to the conference web page: http://www.managingforimpact.org/event/conference-measuring-what-matters-%E2%80%98post-truth%E2%80%99-society
Facing Forward: Innovation, Action, and Reflection http://c2017.evaluationcanada.ca/
The 2017 CES Conference will focus on innovation, action, and reflection. We will explore the latest methods and approaches evaluators are applying in a wide range of contexts and consider the ways in which evaluators are incorporating reflection into practice. This theme is meant to engage both internal and external evaluators in exploring the different ways in which they are thinking about their role and how they conduct evaluations. We will reflect on our work, explore ideas, share experiences and learn from one another about how to meet the new and emerging demands evaluators are facing in an era of social media, economic challenges, systems thinking, globalization and other emerging trends in our society.
Wed 10 May 2017 – Thu 11 May 2017, London, United Kingdom
Theme: The use and usability of evaluation: Demonstrating and improving the usefulness of evaluation
The theme of the 2017 Annual Evaluation Conference is exploring the current uses of evaluation. Evaluation is a common term in the English language, means different things to different people, and is used in many different ways. We need to ensure that institutions and staff who wish to “evaluate” their policies, programmes, projects and institutions know what that really means and what they will gain from the exercise. How, for example, does it differ from an audit, monitoring, or a review? Is it the same as research? Can evaluation be used to measure impact, or success? Is it conducted before, during or after taking action? Is it an external or internal exercise? Who is involved, and how are its conclusions communicated?
The emphasis placed on different uses of evaluation changes over time. Past emphases have included accountability, establishing impact, formative and developmental evaluation to improve implementation, and generating evidence about what works.
Organised by Visionary Analytics and the Ministry of Finance
Details to be provided
Information on the previous evaluation conferences in Lithuania is provided here: http://esinvesticijos.lt/en/events/evaluationconference
Tue 19 Sep 2017 – Wed 20 Sep 2017, Kingsbury Hotel, Colombo, Sri Lanka
The Sri Lanka Evaluation Association (SLEvA) is a voluntary civil society organisation established in 1999 by an interested group of professionals from government, the private sector and non-governmental organisations. One of SLEvA’s flagship events is its Biennial International Conference in Sri Lanka, which attracts both local and foreign presentations and participants.
This year’s Conference theme is Actioning Evaluation - Key to Sustainable Development (SD).
For more information visit the conference website: https://slevaconference2015.wordpress.com/
120/10, "Vidya Mandiraya"
Wijerama Mawatha, Colombo 07
Tel: +94 11 2696235
Wed 27 Sep 2017 – Fri 29 Sep 2017, Chicago, Illinois, USA
Heightened community unrest sparked by the deaths of unarmed citizens; disproportionate inequities in education, poverty, health care, and rates of incarceration; and an intensely divisive U.S. presidential election require even more vigilant attention from our global CREA community. It is critically important that we focus on the generation, analysis, and use of substantive evidence “that matters” in the evaluations and assessments we undertake. To address the issues our communities face, we have a responsibility to raise questions about what is being done to correct inequities and to translate this evidence aggressively into action that has a meaningful impact on our collective future.
Therefore the conference, “Evidence Matters: Culturally Responsive Evaluation and Assessment Translating to Action and Impact in Challenging Times”, will focus on the following areas:
Program evaluation, measurement and assessment as sources of evidence
Challenging the status quo regarding whose evidence matters
Cultural responsiveness as foundational to more equitable public policy
Moving from evidence generation to advocacy and action
Policies and practices of influence and consequence in the quest for social justice
Ethical challenges in complex areas of inquiry; whose justice is advanced?
Conference flyer: 2017 CREA Conference Flier
Conference website: http://crea.education.illinois.edu/home/crea-conference-2017
Theme: Evaluation + Design
Everything we evaluate is designed. Every evaluation we conduct is designed. Every report, graph, or figure we present is designed. In our profession, design and evaluation are woven together to support the same purpose—making the world a better place. By considering both as parts of a whole, we can advance that purpose.
This year, we will consider the integration of design and evaluation in three areas.
Program Design: Programs are the intentional actions that organizations take to improve the social or natural world. Today, organizations of every kind are designing and implementing programs. They include:
- nonprofit organizations serving the most critical needs of communities with education, human services, and health programs;
- government agencies piloting innovative solutions to longstanding social problems;
- philanthropists supporting collaborative programs designed to create collective impact;
- corporations implementing social responsibility programs that promote social equity and environmental sustainability;
- and public-private initiatives, such as pay-for-success funding, social enterprises, and impact investing, that leverage private capital for social good.
Every program is designed, yet the field of evaluation has not developed a systematic approach to designing programs. Should we? Can we? What would it look like? What role should evaluators play? How can evaluation be built into a program from the start? Can we design for sustainability? What does it mean for a program to have an exemplary design?
Evaluation Design: An evaluation design integrates evaluation theories, approaches, and methods to achieve a set of intended purposes in a specific context. An evaluation design encompasses more than research design. It also includes:
- the use of evaluation as a direct means of creating change, for example by working with stakeholders in ways that promote equity and empower communities;
- strategies for understanding, working with, and building consensus among stakeholders;
- the thoughtful combination of quantitative and qualitative methods;
- the application of evaluation for formative, summative, and developmental purposes;
- and the development of evaluative criteria that establish what matters, to whom, and why.
Evaluation design is dynamic, changing as programs develop and real-world challenges emerge. Given this complexity, what constitutes an exemplary evaluation design? Do flexibility and responsiveness come at the expense of quality, credibility, or usefulness? Can we learn to design evaluations that are faster, cheaper, and better? How do we balance the competing interests of stakeholders?
Information Design: Evaluators must communicate to diverse audiences about the complexities of programs and their impacts. They strive to be accurate, compelling, and clear. At the same time, they are constrained by the time, attention, and training of audiences. To be successful, evaluators must develop strong information design skills, such as:
- data visualization techniques that transform mind-boggling complexity into clear, meaningful images;
- real-time data displays that help managers make decisions better and more quickly;
- storytelling that speaks to the part of our brains hardwired by evolution to learn from narrative;
- and multi-faceted communication strategies that leverage social psychology and social media to promote appropriate and timely use.
Information design plays a central role in evaluation. But what is good information design? How should it be taught and learned? Can we be persuasive and accurate at the same time? Have evaluation reports become obsolete? What roles can online and interactive technologies play?
In 2016, professiona