9 golden rules for commissioning a waste-of-money evaluation

Here in New Zealand we’re nearing the end of the central government financial year, with the usual utterly ridiculous frenzy of trying to jam just one more thing into this year’s budget before it gets sucked down a black hole never to be seen again. So, it seemed timely to come back to the notion of value for money – not of programs or policies, but of evaluations themselves.

There may well be some managers out there who have yet to experience the thrill of commissioning a total waste-of-money evaluation. Well, thankfully, your colleagues around the world have been working diligently on building a knowledge base just for you …

9 golden rules for commissioning a waste-of-money evaluation:

1. Be vague about what kind of evaluation you need. There’s only one kind, isn’t there?
2. Be vague about the budget. It’s just the same as shopping around for widgets or fleet vehicles – price competition works exactly the same way.

And besides, evaluators are all mindreaders, so will easily figure out whether you want a Rolls Royce or a scooter.

3. Use an onerous RFP process. Save on proposal reading time by narrowing the field down to large evaluation shops with minions who can copy and paste boilerplate and spend loads of time on a glossy 20+ page proposal.
4. Hire researchers to do an evaluator’s job. Oh, there’s a difference?
5. Opt for subject matter expertise over evaluation and cultural expertise. After all, evaluation’s just a content-specific measurement exercise, isn’t it?

Besides, there’s no such thing as “evaluation expertise”

And why would we need “cultural expertise” anyway? This program might be for a diverse community, but it’s not one of those “cultural” programs!

6. Assume that “cultural expertise” is sufficient when the only brown faces on the team are the research assistants. Any whipper-snapper worth their salt can challenge a senior white evaluator and get them to completely reconceptualize a fundamentally flawed design.

Not that all that stuff matters anyway – it’s really just a matter of user-friendly faces to do the data collection and translation work.

7. When budget is tight, opt for the contractor with the lower daily rate. After all, you can get more days/hours out of them, right?

Don’t be put off by pesky details, such as the fact that an experienced evaluator working at twice the daily rate can get the job done 4 times as fast, and to a level of quality that the less experienced person/team can never come close to, no matter how much time they put in.

8. Sign the contract, then hope for the best until the final report arrives. You don’t want to be breathing down the evaluator’s neck.

And besides, they know best what you need, for whom, when, why, and in what form.

9. Bury and forget about disappointing evaluations. No point dwelling on a bad memory; just file it away and commission another one!

Want to suggest a gem or two to add to the list? The comments box below awaits! Can we try for one from each region/continent here, just to ensure we are getting a global view?

32 comments to 9 golden rules for commissioning a waste-of-money evaluation

  • Dugan Fraser

    Hire an economist because all the really interesting issues can be counted and anything that can’t really doesn’t matter. Plus economists specialise in telling you what things cost, and that’s all you’re really interested in anyway.

    [Africa]

  • I LOVE THIS. Here are a few additions:

    Make sure your program/org design and strategies are lacking in focus and intent.

    Assume that the “evaluators” can do magic and the responsibility for demonstrating impact is theirs.

    We should do another list from the evaluator perspective!

    [North America]

  • Bianca Montrosse

    Ha ha…this is great Jane! Here’s a few more I have encountered…

    Evaluators don’t really need to be commissioned until shortly before program funding ends. (What information could they possibly contribute that would be useful before the end anyway?)

    Related…

    Evaluations don’t really need to be commissioned until the granting agency asks for an evaluation report. (It’s just another hoop to jump through for the funding agency anyway.)

    And, my personal favorite…

    All evaluations, no matter the size and scope, somehow magically cost between 10 and 15% of the total budget. (At least that is what I read somewhere.)

    [North America]

  • Carolyn Sullins

    #10. “Eye for an eye”: If the evaluation is critical of your program, don’t try to learn from it. Instead, write a scathing note to the evaluator’s supervisor about his/her alleged incompetence.

    [North America]

  • Patricia Rogers

    And this one:

    “Guess the number in my head” – test out the competence of the evaluator by not sharing data, information that data exist, previous evaluations, concurrent evaluations, planned policy changes, forthcoming personnel changes, the dates of critical decisions and meetings, or what you actually want the evaluation to achieve. If the evaluator is any good, they’ll be able to figure all of this out by themselves.

    [Oceania]

  • How about this one:

    Expect the evaluator to develop and cost an evaluation approach and design based only on the RFP documentation (because of course all the important information needed for design is there). And then hold them to this contractually, even when the program and the context turn out to be completely different to what the RFP documentation suggested. After all, evaluation always proceeds just like in the proposal, right? And evaluators can nimbly adapt and absorb the cost overruns; besides, it was the evaluator who proposed the approach in the first place, right?

    [Oceania]

  • Anna Douglas

    Hire evaluators with a pre-fab evaluation plan, even if it was designed for organizations that have little in common with your own. It doesn’t really matter if the evaluators “get” what your organization does…they’ll still be able to evaluate it, right?

    Also, don’t share the results with any of the employees or stakeholders. Let’s just say we had an evaluation done and hope nobody sees the damned thing.

    [North America]

  • Jane Davidson

    We are loving the suggestions – keep them coming!

    Re: our goal for global participation from each continent/region, we are still waiting to hear from:
    * Central & South America
    * Europe
    * Asia

  • Some of my recent annoyances:

    Refuse to pay for a literature review. Who says science is a cumulative enterprise?

    Predict what data collection processes will be required, specify them in the RFP and then insist on the evaluator collecting data that way.

    Commission an evaluation, don’t use it, then commission the very same evaluation a couple of years later, don’t use it, then commission the very same evaluation a couple of years later, don’t use it, then . . .

    [Africa]

  • Tom Grayson

    Great stuff! Here is another addition. Be sure to use data that is not actionable. Example: the evaluative question is: How might we reduce the number of underage undergraduate drinking violations on campus? Simply report that there are 12 campus taverns/bars. Do not report that the data indicates that 3 taverns/bars on our campus account for 97% of all underage undergraduate student drinking. The latter information would only be used to target enforcement efforts.

    [North America]

  • A good evaluator is a dead evaluator

    Iran [West Asia]

  • Laura Tagle

    Ask for ongoing evaluations and ask for evaluation reports at an early stage in program implementation, then complain you have not received any evidence of impacts.

    Make sure you ask for a wholesale evaluation of an entire multi-sector program operating nationally or at least in a large region.

    Terms of reference never need to be changed: the ones from 1996 are perfectly fine for 2010–no matter what has happened to the programs, the national or EU evaluation policy, the evaluation market, the world.

    Do have your evaluator write the terms of reference for the next evaluation.

    Make sure your evaluation contracts last longer than your marriage; if forced to interrupt the current one, make sure you remarry, oops, re-hire, the same company.

    Large generalist companies doing technical assistance beat small companies and individuals any day; make sure your barriers to entry stay high by requiring huge financial capacity and previous experience in exactly that type of program.

    Never check whether the big names on the proposals have the actual time to do the work they promise; ignore the fact that underpaid and unsupervised juniors are trying to do the work.

    [Europe]

  • Finally one from Europe:

    Never give any follow-up to the evaluation. You had already decided what you were going to do anyway.

  • Irene Guijt

    Commission the evaluation to start on the first day of office opening and first day of work for staff members to ensure truly in-depth stories of impact are heard.

    Ask for a formative evaluation and, when giving feedback on the draft evaluation report, change your mind and say ‘actually, summative is what we’re really after, so would you mind doing a slight edit, and oh, yes, you might need other data as well’. Above all, blame the evaluator for this oversight.

    Commission an evaluation to start in South Africa during the first World Cup week.

    Reassure the lead evaluator that she/he will have lots of in-house support, but leave out the part about one being on paternity leave, another just transferred elsewhere, and the third member of the ‘sounding board and support team’ working part-time and winding down by the end of the month.

    [Europe]

  • Thomaz Chianca

    One from Brazil: “The Indicators’ Church”

    Always starting any evaluation by defining which indicators to use and the best instruments to measure them… and never really getting to the hard thinking on what you definitely need to know about the evaluand to determine its quality, value and importance, i.e., the big-picture evaluation questions; and also what parameters (values) will be used to determine whether performance was exceptional, good, mediocre, weak or unacceptably bad.

    [South America]

  • Let me add an important cost-saving measure for an evaluation: provide the preliminary findings you want to the evaluators on the first day. This really cuts down on costs by reducing the data collection investment! You can apply this important cost-cutting measure especially if you have done a good job implementing golden rule number 7, since when you have an evaluator with less experience, they may actually be grateful you are making their job easier.

    [North America]

  • Laura Tagle

    May I complicate rule #7 from my European perspective? We are actually suggesting that hiring a more junior person directly and explicitly (i.e., crediting the evaluation to her name once it is done) may be more conducive to good evaluation than hiring a big shot or a company. This is because, in our evaluation market, the actual work is almost invariably done by underpaid junior staff who never get invited to meetings, consistently receive very little or no support from senior evaluators, and constantly remain invisible, thus not accumulating any CV-boosting, call-for-tender-demonstrable experience. We hope that bringing these people to the fore, together with activating Steering Groups and other forms of support, may improve their work conditions and their performance, and activate reputational mechanisms.

    [Europe]

  • Anna Douglas

    These are so funny! It is interesting to realize some of my frustrations are not specific to my culture; that these are happening around the world. Thanks, Jane, for asking for global participation.

    [North America]

  • Annabel Brown

    # Management getting you down?

    If a program has problems or there is some conflict and some management decisions need to be made… don’t manage them properly… instead commission a ‘REVIEW’…

    Make the scope, purpose and questions vague, don’t tell the reviewer about the problems or conflict… and just let them go ahead and exacerbate it!

    It’s great fun and guarantees you don’t have to manage properly or do anything with the review results. If the problems bubble up again… just commission another review!

    [Oceania]

  • Stephen Porter

    Perhaps we need to ask how we perpetuate these rules?

    Reading these reminds me of Cassius’ challenge to Brutus:

    Why, man, he doth bestride the narrow world
    Like a Colossus, and we petty men
    Walk under his huge legs and peep about
    To find ourselves dishonourable graves.
    Men at some time are masters of their fates:
    The fault, dear Brutus, is not in our stars,
    But in ourselves, that we are underlings.

    The ill-incentives of states often induce waste-of-money evaluations. Perhaps there is work to be done in challenging this demand for evaluation, developing a range of clients who care what evaluation they demand, and taking more time when training and mentoring on specifying good evaluation.

    Any other ideas on solutions?

    [RSA, Africa]

  • Mathea Roorda

    Ensure there is an academic on the team. Rationale: you want a credible evaluation, don’t you? Evaluators aren’t ‘real’ researchers, after all…

  • “TASTE THE SOUP FIRST” before you publish the report… if you need to re-fudge, don’t hesitate…!
    After all, as my Iranian colleague Mohamed Hassan put it eloquently, “A good evaluator is a dead evaluator”…

  • This one from Sudan:
    “We know definitely there is something wrong in this project, but let’s make it clear to you: it can’t be due to the national Islamization policy or ideology of the country. Let’s find out who’s the person responsible for sabotaging the project.”

    It’s been happening since the current regime of President Bashir took power in 1989!

  • We’ve built on your point #3 about onerous RFP processes. As a small shop, we’re tired of giving away our ideas for free; maybe people with academic appointments can do this (probably they should, they’re earning a salary after all), but we’re outta there, as they say. Take a look:

    http://www.usablellc.net/an-alternative-to-the-evaluation-rfp-process#more-578

  • Patricia Rogers

    Thanks, Eric, for this guidance. I’ve recently been teaching a course on Evaluation for Public Sector Managers at the Australia and New Zealand School of Government, and this is an additional resource I will refer them to. I’m particularly interested in your position on stating a budget range. There are still some people who believe that giving an indication of the budget breaches probity rules, because they don’t understand that buying an evaluation service is not like buying a standardized product, and that if you don’t give a budget indication you are likely to get wildly varying budgets, wasting everyone’s time.

  • Dean Whitehead

    “Let the games begin!”…early…

    Before raising the temperature in the kitchen, assemble your ingredients and tools. Don’t jump to an RFP. Instead, develop an RFI to request information on how evaluators or shops would approach your project. Next, issue an RFQ to request the qualifications of the competitors. Compile all responses and reconfigure as the mother of all RFPs, then beat them down on the price!

    Who says government work can’t be fun?

  • Having completed three “tours of duty” through two different Canadian government sectors*, here is my experience with ‘commissioning a waste-of-money evaluation’: don’t have a professionally trained evaluator in the room at any point of discussion of the project or, better yet, not even in the building. The best situation is to have the professional evaluator out of town… too cynical?

    *my apologies to those truly dedicated and talented civil servants…I once was one…a blog in itself or, perhaps, a fringe play?

  • Joining late in this hilarious posting, let me try this one:

    >> Give the evaluation team only 1 week to answer all evaluation questions copy-pasted from the OECD-DAC criteria.

    [Southeast Asia]

  • Tarina MacDonald

    Lol… fellow contributors… my little offering is:

    Put an RFP out for a longitudinal study of, say, 2-3 years with only 1 month till the RFP closing date… lol lol…

    and counting the many more to respond…

  • Can’t believe I’m just seeing this for the first time… incredible! “It’s not one of those ‘cultural’ programs” … oh my goodness…

    Addition: Make sure to blame the evaluator/methods when results are less-than-ideal, because there’s no way your pilot program was anything less than perfect.

  • Just tell the evaluator “we need a survey”. That’s my fave! :)

    If that’s all you need, then why am I here again?

    I’m just seeing this too, nice revamp of the site, highlighting the most popular posts!

  • Diz McKinnon

    Implement an ethics process that is at odds with your RFQ, then blame the evaluator for not being able to clear the ethics process and/or adhere to your stated methodology.

    Ask for a frank and fearless report while withholding crucial information and refusing to provide figures.

    Insist that the evaluation team include a person of your choice with no relevant experience or intention to comply with acceptable ethical standards.