Flawed measurement can suggest illusory savings


Today’s Managing Health Care Costs Indicator is 10


I gave a series of talks in 2003 about how poor measurement in the disease management industry generated misleading claims of cost savings.  

Fast forward to this decade, and the wellness industry’s promised “returns on investment” often appear too good to be true.  If they seem too good to be true, they probably are.

When employers, health plans, governments, or others review claims of cost savings, here are a few things they should look for:

What’s the comparison group?
We should insist on comparison groups that are really comparable.  It’s useless to compare refuseniks to those who voluntarily participate in an optional program.  Even propensity matching to control for differences between groups can only adjust for known confounders, like age, gender, and job class.  The differences that matter most, like readiness to change, are often not adequately adjusted for.
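To make the selection problem concrete, here is a minimal sketch in Python using made-up numbers (not drawn from any real program): readiness to change drives both participation and cost reduction, so a naive comparison credits the program with savings it never produced.

```python
# Minimal simulation sketch with hypothetical numbers: an unmeasured trait
# ("readiness to change") drives both who joins a voluntary program and how
# much their costs fall, so participants look better than refusers even
# though the program itself has zero effect.
import random
from statistics import mean

random.seed(0)

def simulate_person():
    readiness = random.random()                  # unobserved confounder, 0 to 1
    participates = random.random() < readiness   # the ready are more likely to join
    # True program effect is zero: the cost change depends only on readiness + noise.
    cost_change = -2000 * readiness + random.gauss(0, 500)
    return participates, cost_change

people = [simulate_person() for _ in range(100_000)]
participant_changes = [c for joined, c in people if joined]
refuser_changes = [c for joined, c in people if not joined]

print(f"Participants' average cost change: {mean(participant_changes):8.0f}")
print(f"Refusers' average cost change:     {mean(refuser_changes):8.0f}")
# The gap between these two averages is pure selection bias, not savings.
```

No matching on observables (age, gender, job class) would close this gap, because the trait doing the work never appears in the data.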

What’s the participation in the intervention?
What if I offered an intervention, no one participated, but the results were good anyway?  Would you give me credit for my intervention?  I don’t think so.  This is a face validity issue: let’s be sure that enough members actually participated to have the kind of impact claimed by a medical or health management company.
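One quick face-validity check, illustrated below with hypothetical figures, is to divide the claimed savings by the number of members who actually participated; if the implied savings per participant is implausibly large, the claim deserves a hard look.

```python
# Face-validity check with hypothetical figures: how much would each actual
# participant have to save for the vendor's claim to hold?
claimed_total_savings = 5_000_000   # dollars claimed by the vendor
eligible_members = 50_000
participation_rate = 0.04           # only 4% of members ever engaged

participants = eligible_members * participation_rate
implied_savings_per_participant = claimed_total_savings / participants
print(f"Implied savings per actual participant: ${implied_savings_per_participant:,.0f}")
# $2,500 per engaged member would be an extraordinary effect for most programs.
```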

Is it plausible that the intervention would lead to the promised cost savings?
Does it make sense that the intervention would have the impact claimed? Some wellness programs offer no intervention beyond serial health risk assessments (HRAs), but infer huge claims savings from changes between the first and second HRA administration.  HRAs can be instructive, but on their own they are unlikely to produce major behavior change.


Are all costs of identification, enrollment, and intervention included in the evaluation?
There is a tendency to include only a portion of a program’s total costs when evaluating its results.  For a health plan that hires a medical management company, there are substantial interface costs, including data transfer, contracting, and supervision, that are easy to overlook.
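As a rough illustration with made-up numbers, counting only the vendor’s fee can make a money-losing program look like a winner; adding the plan’s own interface costs changes the arithmetic.

```python
# Hypothetical numbers: reported ROI often counts only the vendor's fee,
# not the health plan's own costs of running the relationship.
gross_savings   = 1_200_000   # claimed reduction in claims spending
vendor_fees     =   800_000   # what the vendor invoices
interface_costs =   500_000   # data feeds, contracting, supervision, IT

roi_vendor_fee_only = gross_savings / vendor_fees
roi_all_costs = gross_savings / (vendor_fees + interface_costs)

print(f"ROI counting vendor fees only: {roi_vendor_fee_only:.2f}")   # 1.50
print(f"ROI counting all costs:        {roi_all_costs:.2f}")         # 0.92
```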

Are any substitute claims costs included in the return on investment calculation?
If a hospitalization is averted through the appropriate use of home services, it’s important to look at the net savings, removing the incremental costs of home care. 
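A small worked example with hypothetical figures: the gross savings from the averted admission overstate the benefit until the cost of the substitute home care is subtracted.

```python
# Hypothetical figures: netting out the cost of the substitute service.
averted_hospitalization_cost = 15_000   # gross savings per averted admission
home_care_cost               =  6_000   # incremental cost of the home services

gross_savings = averted_hospitalization_cost
net_savings = averted_hospitalization_cost - home_care_cost
print(f"Gross savings: ${gross_savings:,}")   # $15,000
print(f"Net savings:   ${net_savings:,}")     # $9,000
```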


I suggested in 2003 that there were ten ways to “cook the books” and make an intervention seem more effective than it really was.  They were:
  1. Overstate the savings
  2. Understate the costs
  3. Report only some of the data
  4. Ignore effects of timing
  5. Mistake gross for net savings
  6. Inappropriately extrapolate from experience
  7. Ignore risk of failure
  8. Claim credit for savings realized by other parties
  9. Overstate the inflation factor
  10. Tell a story that is too good to be true

These observations appear as relevant in 2011 as they did almost a decade ago.