Patricia Rogers, in her recent post entitled Does the recent evaluation show that Head Start doesn’t work? asks the question:
Why do these reports summarize the findings inaccurately? Is it deliberate misrepresentation or error? Is it just too hard to include variations in the results in a brief summary, or are the reporters not sufficiently skilled? Or do the reporters judge that the results are not broad enough across the domains, or not large enough?
I’d like to add to this question/discussion with a few observations about the media and its reporting of not just evaluation findings, but even just simple evaluative concepts.
Here in New Zealand, the government has moved to introduce National Standards for literacy and numeracy in primary schools. These are basically descriptions of what children should be able to do/read/write/understand in order to adequately access the rest of the curriculum at that level. Teachers are to use a range of appropriate assessment tools (as they presumably do already) to gauge where each child is at, and – this is the new piece – are required to clearly report to parents/families whether their child is performing at, above, below, or well below the National Standards in literacy and numeracy.
Just last week (at the time this post was originally written, in Feb 2010), New Zealand’s most watched news program (TVOne 6pm news) ran an item that began by telling us how terribly confused parents were about the new National Standards – implying, of course, that they were going to enlighten everyone and help them understand the basics. Right … take a look at this 3-minute news item: Controversial new national standards introduced.
The National Standards are a criterion-referenced approach, i.e., performance (in literacy and numeracy) is evaluated against – as the title implies – “standards” (i.e., where the child NEEDS to be).
How does TVOne news explain them? As a norm-referenced approach. The reporter states that the Standards will tell parents whether their child is at, above, or below the “national average” (i.e., where the child is relative to other peers). WRONG!!
The same news program has repeatedly said that children are going to be “ranked” – illustrating exactly the same total lack of understanding of a very, very fundamental evaluation concept.
Most lay people can grasp the difference between grading/rating and ranking, so what’s wrong with the media?
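The distinction the reporter missed is easy to show concretely. Here is a minimal sketch in Python – all names, scores, and threshold values are invented for illustration only – of how the same set of scores answers two quite different questions depending on whether you compare each child to a fixed standard (criterion-referenced) or to the other children (norm-referenced):

```python
# Criterion-referenced vs norm-referenced: same scores, two different questions.
# All names, scores, and thresholds below are hypothetical.

scores = {"Aroha": 78, "Ben": 55, "Chloe": 62, "Dev": 91}

# Criterion-referenced (what the National Standards actually do):
# compare each child to a fixed standard, regardless of peers.
STANDARD = 60  # hypothetical "at the standard" cut-off

def against_standard(score, standard=STANDARD):
    """Return at / above / below / well below relative to a fixed standard."""
    if score >= standard + 15:
        return "above"
    elif score >= standard:
        return "at"
    elif score >= standard - 15:
        return "below"
    return "well below"

criterion_report = {name: against_standard(s) for name, s in scores.items()}

# Norm-referenced (what the TV report described):
# compare each child to the group - e.g. to the average, or as a ranking.
average = sum(scores.values()) / len(scores)
norm_report = {name: ("above average" if s > average else "below average")
               for name, s in scores.items()}
ranking = sorted(scores, key=scores.get, reverse=True)

print(criterion_report)  # every child could be "at" or "above" the standard
print(norm_report)       # by construction, someone is always below average
print(ranking)           # ranking orders children against each other
```

Note the key point: under the criterion-referenced scheme it is perfectly possible for every child to be “at” or “above” the standard, whereas the norm-referenced scheme guarantees that some children land below average no matter how well the whole group performs – which is exactly why reporting the Standards as a comparison to the “national average” gets the concept wrong.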
My own observations lead me to believe that reporters and those controlling the quality of mainstream news media reports are, for the most part, woefully lacking in critical thinking ability, have a completely inadequate grasp of even the most basic evaluation concepts, and seem to have no sense of responsibility about getting a sound understanding before going to air and ‘educating’ the public.
The reporting of actual evaluation findings is a bit more complicated, as Patricia explains in her post. My hunch is that much of the blatantly inaccurate reporting we see is a combination of:
- ignorance of key evaluative concepts,
- a lack of critical thinking ability,
- laziness (e.g. just reporting what others have said instead of actually reading the original sources),
- cherry-picking the newsworthy (read: ratings-boosting) snippets at the expense of a more complete representation, and
- pressures to dumb down the news into tabloid-sized snippets that the public don’t have to work too hard to take in.
Of course, I’m mostly referring to ‘mainstream’ TV here; the print media and some radio stations often do a much better job (and thank goodness someone does), but IMHO the bulk of it is still well short of good enough, genuine reporting. And, of course, I’m not watching media from all over the world (mostly New Zealand, but we get some Australian, U.S. and English channels here) – how do your local media outlets compare?
I suppose the big question for me is, what can we DO about this? Here are a few ideas that have crossed my mind, but I’d very much like to hear others’ views:
- Teach critical thinking in schools and in higher education.
- Teach some basic evaluative thinking too – it’s a life skill; it can save you a fortune as you go through life making decisions about what to purchase, invest in, etc.
- Help evaluators turn the nub of their findings into easy-to-grasp sound bites that the media could pick up and use as they are – maybe all evaluation reports should come with a set of press releases, some very short, some a little longer?
- As citizens and consumers of the news, keep criticizing the dumbed down news that is fed to the masses.
- Perhaps professional evaluation associations should consider commenting publicly on the misrepresentation of evaluative concepts and evaluation findings.