
Feature: Statistics Behind the Headlines

Are you 45% more likely to die in a UK hospital rather than a US hospital?

BMJ 2013; 347 doi: https://doi.org/10.1136/bmj.f5775 (Published 24 September 2013) Cite this as: BMJ 2013;347:f5775
David Spiegelhalter, Winton professor for the public understanding of risk, University of Cambridge, Cambridge, UK
d.spiegelhalter{at}statslab.cam.ac.uk

David Spiegelhalter is frustrated by the recent headlines claiming that English patients are more likely to die in hospital than patients in the US

On 11 September Channel 4 News carried lengthy and uncritical coverage of work by Brian Jarman comparing hospital mortality in seven Western countries between 2004 and 2012. The headline claims were that English “health service patients are 45% more likely to die in hospital than in the US,”1 which was the leading (and only named) country of the seven being compared.

This was followed by newspaper coverage including claims that “A patient in England was five times as likely to die of pneumonia and twice as likely to die of septicaemia compared to similar patients in the US.”2

The basis of these claims was questioned on Twitter, in online articles, and in blogs, particularly as neither the data nor the methods were publicly available; it is perhaps notable that the BBC’s website did not cover the story at all. To his credit, Jarman responded with a torrent of robust tweets and provided links to files with some limited details of the methods and results.3

Health systems differ

Nevertheless, it is frustratingly difficult to assess the evidence for his conclusions because of the lack of information about the data, methods, hospitals, and even countries involved. Jarman seems to have pooled routinely collected, individual level data on hospital discharges from the seven countries, used in-hospital mortality as the outcome, and fitted a common prediction model using age, sex, emergency or elective admission, comorbidity score, and diagnosis, with diagnoses grouped using the Agency for Healthcare Research and Quality clinical classification system, which is based on ICD-9 codes.4 This enabled him to calculate an expected mortality risk for each hospital’s admissions and so obtain a hospital standardised mortality ratio (HSMR) for each hospital.
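
For readers unfamiliar with the mechanics, the sketch below shows how an HSMR of this general kind can be computed from discharge level data. The file name, the column names, and the use of scikit-learn’s logistic regression are illustrative assumptions, not details of Jarman’s actual pipeline.

```python
# Rough sketch of an HSMR calculation from pooled, discharge level data.
# Column names and modelling choices here are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# One row per admission; 'died' is 1 if the patient died in hospital.
df = pd.read_csv("discharges.csv")  # hypothetical pooled dataset

# Case mix variables roughly matching those described above: age, sex,
# emergency v elective admission, comorbidity score, and a diagnosis group
# (for example, an AHRQ clinical classification category).
X = pd.get_dummies(
    df[["age", "sex", "emergency", "comorbidity_score", "diagnosis_group"]],
    columns=["sex", "emergency", "diagnosis_group"],
    drop_first=True,
)
y = df["died"]

# A single prediction model fitted to the pooled data from all countries.
model = LogisticRegression(max_iter=1000).fit(X, y)

# Expected risk of in-hospital death for each admission, given its case mix.
df["expected_risk"] = model.predict_proba(X)[:, 1]

# HSMR per hospital: 100 x observed deaths / expected deaths.
hsmr = df.groupby("hospital_id").apply(
    lambda g: 100 * g["died"].sum() / g["expected_risk"].sum()
)
print(hsmr.sort_values(ascending=False).head())
```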

Criticism has focused on the comparison with the US. Indeed, in the document provided to Channel 4 News,5 Jarman acknowledges that the US “has lower life expectancy and higher infant mortality rates” than the UK and that “there is a disincentive for poorer people to be admitted” (it is notable that this international HSMR, unlike the UK version, does not adjust for deprivation). In addition, the comparability of coding can be questioned because of the known practice of “up-coding” in the US to increase reimbursement6 and the possibly different use of terms such as pneumonia and sepsis.

There also seem to be wide international differences in discharge policies before death—a recent study estimated that 78% of deaths in Japan occur in hospital, compared with 56% in England and Wales, 45% in the US, and 34% in the Netherlands.7 This will have an important effect on in-hospital mortality—other countries may rapidly move patients into intermediate care facilities, an option that is not readily available in the UK. Jarman himself observed in 2004 that “In-hospital death rates are 4.9% in the US compared with 9.3% in England, suggesting that people are more likely to die out of hospital in the United States”8—a similar finding to his current analysis but with a rather different emphasis. Given that there is also general scepticism about the HSMR methodology, my personal inclination is to take little notice of the overall comparison with the US.
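
To see why this matters for the comparison, consider a stylised example with invented numbers, not taken from the actual data: two hospitals with identical 30 day mortality among comparable patients will report very different in-hospital mortality if one routinely transfers dying patients to hospice or intermediate care before death.

```python
# Stylised illustration with invented numbers. Two hospitals give identical
# care: 10% of 1000 comparable admissions die within 30 days. They differ only
# in how often dying patients are transferred out of hospital before death.
admissions = 1000
deaths_within_30_days = 100

in_hospital_deaths_a = 0.90 * deaths_within_30_days  # 90% of deaths occur in hospital
in_hospital_deaths_b = 0.45 * deaths_within_30_days  # 45% occur in hospital

print(in_hospital_deaths_a / admissions)  # 0.09  -> 9% in-hospital mortality
print(in_hospital_deaths_b / admissions)  # 0.045 -> 4.5% in-hospital mortality
# Same underlying outcomes, but hospital B's in-hospital mortality (and hence
# any HSMR built on it) appears to be half that of hospital A.
```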

Over-reliance on the media

Jarman is clearly passionate about improving the NHS and has been frustrated at the lack of interest taken in his analyses over the years. This has led him on a personal crusade, sidelining the usual routes of scientific papers and worthy committee reports in favour of direct contact with the media. However, just as with the previous Telegraph story about “13 000 needless deaths” (currently subject to an investigation by the Press Complaints Commission after it received two complaints, one from me),9 he seems to trust the media to report his caveats. They almost invariably fail to do so.

We are entering the era of “big data” and, although you can’t help but be impressed at anyone who does logistic regressions with 21 000 000 observations, this is a fine example of when size is not everything—rather, we need data that are fit for the purpose of comparing what would happen to a similar patient were they admitted to different hospitals worldwide.

And unless big also means open data, it is impossible for outside observers to verify the analysis and interpretation, especially when the stories are trumpeted by media with an apparent vested interest in running down the NHS. This inevitably breeds suspicion and scepticism.

Of course, it would be deluded to deny that there are serious problems in aspects of the NHS, or that we could learn from good evidence of improved outcomes for comparable patients and from the culture of safety that exists in the best hospitals in the US and elsewhere. Channel 4 featured the Mayo hospital in Arizona as an example, and ventures such as Risky Business have been pressing these issues for years.10 The recent Berwick report into patient safety proclaimed that “The NHS in England can become the safest health care system in the world,”11 but that this would “require unified will, optimism, investment, and change.” If it takes this kind of publicity to bring these issues to increased prominence and contribute towards a cultural shift, then we should not complain.

But this was an exercise in closed data, and I remain sceptical about the specific statistical claims.

Footnotes

  • Competing interests: I have read and understood the BMJ policy on declaration of interests and have no relevant interests to declare.

  • Provenance and peer review: Commissioned; not externally peer reviewed.
