This is an outline for a presentation I recently delivered.
Common Pitfalls in Monitoring and Evaluation
Issues to consider when you are the implementer or commissioner of evaluations
Introduction: What People Think of Evaluations
People are often wary of evaluations because of bad previous experiences, a lack of experience, or general misconceptions about what evaluations involve.
Introduction: Why must we measure?
Although there is growing consensus that we need to measure the results (outputs, outcomes and impacts) of our projects / programmes / policies, there is still much confusion about exactly why we are doing it.
Two main purposes of evaluations:
--Accountability to various stakeholders
--Learning to improve the projects / programmes / policies
The projects / programmes / policies we implement affect thousands of people, and if we get them wrong, thousands will be affected negatively (or not affected at all).
We often complain about the cost of measuring our impact, but have we considered the costs of not measuring our impact?
Introduction: We want to evaluate BUT…
Once we are convinced that we should be measuring our impacts, a range of other questions come up:
--How should it be evaluated?
--When should it be evaluated?
--How will we know that the impact is the best possible?
--How do we know if it is our programme that made those differences?
--Can we do our own evaluation or should we get some specialist to do it?
If there were simple, one-size-fits-all answers to these questions, evaluation would probably be much more appealing than it is today.
Common Pitfalls in Evaluation 1
Failing to clarify the intended use or the intended users of the evaluation – Producing "Door Stops".
Thinking you can evaluate your impact after year one of an intervention in a complex system – Expecting too much.
Thinking your impact evaluation is only something you need to worry about at the end of the project – Waiting too long.
Measuring every detail of a programme thinking that it will allow you to get to the big picture "impact" – Measuring too much.
Doing the wrong type of evaluation for the phase the project is in – Method / timing match.
Common Pitfalls in Evaluation 2
Allocating too little time and resources to the evaluation – More is better.
Allocating too much time and resources to the evaluation – Less is more.
Sticking to your or someone else’s "template" only – One size does not fit all.
Thinking that an online M&E system will solve all of your problems – Computers don’t solve everything.
Not planning for how the evaluation findings will be used – Findings don’t speak for themselves.
Common Pitfalls in Evaluation 3
Running a lottery when you are supposed to receive tenders for doing the evaluation – Lottery evaluations.
Sending the evaluation team in to open Pandora's box – Don't do evaluation if you need Organisational Development.
Doing an impact evaluation without taking into consideration the possible influence of other initiatives / factors in the environment – Attribution Error.
Doing an impact evaluation without looking at what the unintended consequences of the project were – Tunnel Vision.
Ignoring the voices of the "evaluated" – Disempowering people.
Common Pitfalls in Evaluation 4
Expecting your content specialist to also be an evaluation specialist, and vice versa – Pseudo specialists lead to pseudo knowledge.
Doing evaluations, creating expectations, and then ignoring the results.
Not reporting statistics such as significance levels and effect sizes when your evaluation includes a quantitative component – Being afraid of the "hard stuff".
Not acknowledging the lenses through which you analyse your qualitative data – Being colour blind.
Getting hung up on the debate about whether quantitative or qualitative methods are better – Method Madness.
How to address the pitfalls
Given that until very recently there were no academic programmes focused on training people in evaluation, it is important that we find other ways of improving our understanding of the field.
You need not be an evaluation specialist to be involved with evaluation.
Make sure that the evaluators you work with have development as an ultimate goal.
How to address the pitfalls (continued)
Resources for helping you to do / commission better evaluations
Join an association: For example the South African Monitoring and Evaluation Association (http://www.samea.org.za/) or the African Evaluation Association (http://www.afrea.org/)
Take cognisance of the guidelines and standards produced by these organisations
Make use of the many online resources available on the topic of evaluation (Check out Resources on the SAMEA web page)