I have spent the beginning of the week as Steve Aos’s chaperone, a most agreeable task. His hectic schedule, typically American, is partly my responsibility, so I feel the least I can do is help him get from A to B.
Cost-benefit analysis of prevention programs has got itself a bad reputation. Good but small studies undertaken many years ago are the basis for claims that a single dollar invested (it is generally US evidence) will not only benefit children but also save tens of dollars further down the line.
At first acquaintance the argument is persuasive, but the closer one looks at the original calculations the less convincing they become.
But Aos has revolutionized the field. He has been analyzing the cost-benefit of interventions to improve child outcomes for a decade, and the evidence he produces becomes more compelling with time, not less.
The standard of evidence is set high, much higher than in the UK. But at the Cabinet Office seminar where he spoke on Monday there were clear indications that we are catching up, witness the experimental evaluations of Nurse Family Partnership. And there are real indications from politicians and senior policy makers that the gap will be closed.
How disappointing, then, that another of Steve Aos’s presentations was part of a conference organized by the UK Government to promote its excellent work on programs like Multidimensional Treatment Foster Care (MTFC) and Functional Family Therapy, but which in fact advertised poor evidence.
Both programs are being evaluated by experiment. They work in the US and they should work in the UK. But we don’t know. And we have to find out. The funded randomized controlled trials will tell us.
But delegate packs at the Government-funded conference contained reports from audits that demonstrate the progress of children enrolled in the MTFC intervention group. Impressive, but is that progress more or less impressive than the control group’s? That’s what we need to know, and presumably at some point we’ll find out. But we don’t know yet.
It’s understandable that the Government should be keen to get its hands on audit data on the progress of children in the intervention group. But reporting it in isolation is confusing to a policy and practice audience still learning about the primacy, in an outcome-orientated world, of experimental evaluations, and just beginning to recognize how misleading results from studies without a control group can be.