This week I was talking with a colleague who is currently reviewing draft evaluation reports. She has been trying to explain to the authors that a useful evaluation report is neither a recital of a few indicators nor an item-by-item data dump from a questionnaire, but a coherent set of answers to key evaluation questions, with evidence supporting the conclusions and judgments.
It was helpful to be able to refer her to two of Jane Davidson’s publications on this issue, including her great presentation from the 2009 AEA conference:
Improving evaluation questions and answers: Getting actionable answers for real-world decision makers
Jane Davidson on ensuring evaluations actually ask evaluative questions and give clear, evaluative answers. AEA 2009 conference demonstration session, downloadable from the AEA e-library.
Unlearning some of our social scientist habits
Jane Davidson on how academic training in the social sciences can impede genuine evaluation. Downloadable: Journal of Multidisciplinary Evaluation, 4(8), 2007, pp. iii–vi.
Jane will be presenting a one-day workshop on this and more in her ‘Actionable evaluation tools and methodologies’ pre-conference workshop before the Australasian Evaluation Society conference in Adelaide on Tuesday 28th August 2012. (If you hurry, you can still get early bird registration, which closes Monday 25th June.)
This workshop will provide you with a big-picture framework and a useful mixed-method evaluative tool – rubrics – that can help you ask the right questions and generate clear, direct answers that are both well-reasoned and well-evidenced.
You will learn:
- How to write a set of big-picture overarching questions to guide the evaluation;
- How to develop and use rubrics to get clear, defensible answers to these questions;
- How to respond appropriately to simplistic indicator-based thinking and accusations of subjectivity;
- Tips for evaluation conceptualization and reporting to help you deliver clear, to-the-point, and actionable evaluations.
This workshop combines interactive mini-lectures and small group exercises to build both the big-picture thinking needed to focus evaluation on what really matters and the “nuts and bolts” concepts and tools needed to deliver actionable answers. This is not a research methods workshop. It is designed for less seasoned or even beginner evaluators who may or may not be familiar with applied research methods (not a prerequisite) but who have the niggling feeling there’s “something more” they need in their toolkits, something simple but not simplistic.
The workshop will also provide a chance to ask questions about applying the concepts and methods outlined in Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation (Sage), and hear some of Jane’s more recent thinking on these issues.
9 hot tips for commissioning, managing and doing actionable evaluation – Jane Davidson.