Managing genuine evaluation paradoxes: Genuine reporting

In response to the earlier post on genuine evaluation snippets from around the globe, Irene Guijt raised a very important question about the tensions between several hallmarks of genuine evaluation:

Some important contrasts presented but also one that doesn’t entirely align – tell the whole story but cut to the chase? Include activities, outputs and outcomes AND identify the issue of essence/determine what’s important? Um? Are both possible at the same time?

[see Irene’s full response for more detail]

Managing these tensions and apparent paradoxes is, in my view, one of the things that really clearly distinguishes genuine from non-genuine evaluation.

When I used to teach evaluation to doctoral students in WMU’s Interdisciplinary Ph.D. in Evaluation, here’s what I insisted they do in evaluation reports, short homework assignments, essays, dissertations, and presentations (which, to give credit to the original source, was more or less what Michael Scriven used to insist we do when I studied with him at CGU!):

  1. Even when there are numerous details and nuances that need to be reported, it’s still critically important to cut to the chase. This means presenting the most important points first – it shows you know which ones they are! The secondary issues come next. [This is the exact opposite of traditional academic writing, where we gradually lead the reader through the maze of details to the overall conclusion.]
  2. On the issue of how much detail is enough/too much, the advice to tell the whole story (but don’t get lost in the details) means: tell enough of the story so that the overall messages aren’t misleading.

I’m also reminded of Eleanor Chelimsky’s advice about how to present findings without making the audience glaze over at the level of detail – this is from a post in this week’s riveting Thought Leaders’ Forum (from AEA – it’s members-only access, but membership is incredibly reasonable, and access to these discussions and much more makes it fantastic value for money!):

… But you had to observe Members of Congress closely.  In theory, for example, the legislators loved what they called “hard data,” but I found that their eyes frequently glazed over during my presentations of them.  I often ended up forced to present a really strong study in the form of anecdotes or case studies, which were, however, not an N of 1, but were representative of the study data and overall findings, and had the advantage of being entirely comprehensible to a congressional audience as well as the media.  I should also mention that there was a secondary penalty for NOT doing that:  not only did our message not get across, but we were seen as technocrats, not a good image for that audience.

For me, it’s taken years of experimentation and practice with different ways to cut to the chase and present key messages that various audiences can comprehend and remember, while at the same time telling enough of the nuanced story that the main messages aren’t overly simplistic or misleading. I’m by no means successful all the time, and I’m always on the lookout for insights and ideas. So, please contribute yours in a comment!
