The power of evaluative thinking.
Dr. Jane Davidson and educational systems change expert Joanne McEachen explain how sometimes the obvious solution is not in fact the right solution at all.
Even if it’s part of what’s needed, there are systemic issues in play that must be addressed as well.
Powerful insights to share with …
Read the whole post –> Through the Looking Glass (with evaluative thinking!): How professional learning solutions implicitly blame teachers
In the medical profession in particular, there are some very rigid beliefs about what constitutes good enough “evidence of effectiveness” to justify offering, recommending, allowing patients to try, or even just not vehemently opposing a particular type of treatment. There are glimmers of hope in other sectors (e.g., in the Best Evidence Synthesis work here in New Zealand). But serious challenges remain in building a credible evidence base in three areas, given the constraints and realities surrounding them: (1) cutting-edge treatments; (2) treatments that are by their very nature tailored/individualized rather than standardized across patients or populations; and (3) learning what works for small sub-populations.
Read the whole post –> What constitutes “evidence”? Implications for cutting-edge, tailored treatments, and small sub-populations
The new funding rules for the US Department of Education’s $650 million Investing in Innovation fund appear to be based on an out-of-date model of evidence-based policy and hierarchy of evidence. Recent developments in our understanding of evidence-based policy suggest changes are needed both to the selection criteria and to how successful proposals will be evaluated.
Read the whole post –> Investing In Innovation – a need to apply what we know about evidence-based policy
Most lay people can grasp the difference between grading/rating and ranking, so what’s wrong with the media? Following on from Patricia Rogers’ recent posts about the misreporting of evaluation findings, this post looks at an example from the New Zealand media (reporting on the new National Standards for literacy and numeracy) of leading the public astray through a complete failure to understand this fundamental evaluation concept. Jane also ponders why the mainstream media in particular gets this kind of thing wrong so often …
Read the whole post –> The media and evaluation reporting – clueless or unscrupulous?
Another Head Start evaluation, another controversy about whether the results show it works or not. In her comment on our post on the NY School Milk Study, Susan Wolf drew our attention to some important differences between the recent evaluation report on Head Start and how it was represented in an email from the …
Read the whole post –> Does the recent evaluation show that Head Start doesn’t work?