Following the Francis Inquiry, we need to look at the roles of the Hospital Standardised Mortality Ratio (HSMR) and the Summary Hospital-level Mortality Indicator (SHMI). It may seem obvious – given that Mid Staffordshire NHS Foundation Trust had a persistently high HSMR and poor quality of care – that the debate over whether or not hospital mortality rate is a good measure of quality is settled. It would be easy to draw the wrong conclusion.
The mainstream media talk of precise numbers of avoidable deaths as if the expected mortality calculated by the model were a fact, which is clearly not the case. Those who produce the models are actually careful to caveat their estimates but, unfortunately, publishing a specific number leads others to infer precision and validity. Whilst generalising from a specific instance is dangerous, we can use Mid Staffordshire to illustrate some strengths and weaknesses.
We should start by saying that, had Mid Staffordshire chosen to take the high HSMR (or indeed other information that was available) as an alert and evaluated quality of care, it is likely that action to improve care would have been taken earlier. Clearly then the HSMR would have made a valuable contribution. This is similar to the role that Early Warning Scores (EWS) have in clinical practice: they warn us that there may be problems that are not otherwise apparent.
When developing alerts, sensitivity and specificity matter. To add value they need to err on the side of caution. So every patient with a high EWS will need to be assessed but not all will need intervention, and not every patient who is sick will be identified. A nurse who is concerned about a patient should not fail to ask for help just because the score is low.
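The trade-off described above can be made concrete with the standard definitions of sensitivity and specificity. This is an illustrative sketch only; the patient counts are invented for the sake of the arithmetic and do not come from any real EWS data.

```python
def sensitivity(true_positives, false_negatives):
    """Proportion of genuinely sick patients the alert catches."""
    return true_positives / (true_positives + false_negatives)

def specificity(true_negatives, false_positives):
    """Proportion of well patients the alert correctly leaves alone."""
    return true_negatives / (true_negatives + false_positives)

# A cautious threshold (erring on the side of caution): it catches 90 of
# 100 sick patients, but also flags 300 of 900 well patients for review.
print(sensitivity(90, 10))    # 0.9  - few sick patients missed
print(specificity(600, 300))  # ~0.667 - many alerts need no intervention
```

This is why "every patient with a high EWS will need to be assessed but not all will need intervention": a threshold tuned for high sensitivity inevitably accepts a lower specificity.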
Similarly with HSMR, high values should be investigated but are not proof of poor quality of care. Just as importantly, ‘normal’ or ‘low’ values must never be taken as proof that care is satisfactory or good. This would be true of any monitoring system, but particularly so when considering hospital-wide mortality.
Mid Staffordshire was also one example where changes in coding markedly improved risk-adjusted mortality, but not necessarily quality – a frequent, unintended consequence of public reporting and implicit or explicit target setting. It’s one reason why, when reporting changes in HSMR, it is essential to also report both the observed and expected values.
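The coding effect is easy to see from the arithmetic. HSMR is conventionally expressed as 100 × observed deaths ÷ expected deaths, where the expected figure comes from a risk-adjustment model driven by coded data. The numbers below are invented purely to illustrate the point made above:

```python
def hsmr(observed_deaths, expected_deaths):
    """Hospital Standardised Mortality Ratio: 100 * observed / expected."""
    return 100.0 * observed_deaths / expected_deaths

observed = 520  # actual deaths: unchanged throughout this example

print(hsmr(observed, expected_deaths=450))  # ~115.6: flagged as 'high'

# Re-coding the same admissions (e.g. recording more comorbidities) raises
# the model's expected deaths, with no change whatsoever in care delivered:
print(hsmr(observed, expected_deaths=540))  # ~96.3: now looks 'normal'
```

Because the observed count never changed, reporting only the ratio hides what actually moved – which is exactly why the observed and expected values should be published alongside it.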
An unwelcome recent development is for trusts to respond to reports of poor care by quoting their ‘good’ HSMR or SHMI as evidence of good quality. The thinking behind this is just as flawed as assuming poor care from a high value. It appears that part of the problem at Mid Staffordshire was that the board assumed all was well, and then sought evidence to support that view.
It would be a tragedy if others made the same mistake by over-emphasising HSMR rather than ignoring it. HSMR or SHMI should be used to ask questions but not to provide answers. They may serve as a smoke alarm; they should never be a smoke screen.
Following the publication of the second Francis report, the Department of Health decided to investigate quality of care in 15 other trusts with high HSMR or SHMI. We can be confident that the team charged with this will find scope for improvement in them all, if only because that is true everywhere. Whether they will be those most in need is doubtful, but the risk is that this is how they will be portrayed.
It has been said that ‘All models are wrong, some are useful’. We should add that a model that is useful for one purpose might not be for another. Hospital-wide mortality models may be useful to sound an alarm, but they are not suitable for making either negative or positive judgments about quality of care, whether between institutions or over time.
Measuring safety is important and frustratingly difficult but that does not justify over-simplification. It requires a measurement framework: a single measure is a fantasy.
Simon is a Quality Improvement Fellow currently spending a year at the IHI in Boston, and is Divisional Medical Director, NHS Lothian. www..com/SimonJMackenzie