Article Text

National statutory reporting: not even ticking the boxes? The quality of ‘Learning from Deaths’ reporting in quality accounts within the NHS in England 2017–2020
Zoe Brummell1, Dorit Braun2, Zainab Hussein1, S Ramani Moonesinghe1, Cecilia Vindrola-Padros1

1Department of Targeted Intervention, University College London, London, UK
2Advisor/Lived experience, London, UK

Correspondence to Dr Zoe Brummell; z.brummell{at}nhs.net

Abstract

Introduction Regulation through statutory reporting is used in healthcare internationally to improve accountability, quality of care and patient safety. Since 2017, NHS Secondary Care Trusts (NSCTs) within the National Health Service (NHS) in England have been legally required to report annually, in their publicly available Quality Accounts, both quantitative and qualitative information related to patient deaths within their care, as part of a countrywide patient safety programme: the Learning from Deaths (LfDs) programme.

Method All LfDs reports published between 2017 (programme inception) and 2020 were reviewed and evaluated through a critical realist lens; quantitative data were reported using descriptive statistics and qualitative data were analysed using reflexive thematic analysis.

Results In 2017/2018, 44% of NSCTs reported all six statutory elements of the LfDs reporting regulations; by 2019/2020, 35% of NSCTs were reporting this information. A small number of NSCTs did not report any parts of the LfDs regulatory requirements between 2017 and 2020. Multiple qualitative themes arose from this study, suggesting problematic engagement with the LfDs programme, inaccurate reporting and errors in written communication.

Conclusions The LfDs programme has, to some extent, reduced variation and improved consistency in the way that NSCTs identify, report and investigate deaths. However, 3 years into the LfDs programme, the majority of NSCTs are not reporting as required by law. This calls into question the validity of national statutory reporting in Quality Accounts within the NHS in England as a regulatory process.

  • Health policy
  • Hospital medicine
  • Patient safety
  • Qualitative research
  • Quality improvement

Data availability statement

Data are available in a public, open access repository. All data used in this study are publicly available from NHS Secondary Care Trusts’ Quality Accounts. The dataset for this research came from NHS Trust Quality Accounts (QAs), which are publicly available documents found on NHS Trust websites and referenced within the paper itself. In some instances it may be necessary to write to the NHS Trust to ask for the QA where it cannot be found (or is no longer available) on the trust website. We have provided links for NHS Trusts’ Quality Accounts in the online supplemental information. As far as we are aware, there are no conditions of reuse.


This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


WHAT IS ALREADY KNOWN ON THIS TOPIC

  • Health policies and patient safety programmes with statutory reporting requirements are used, in theory, to drive improvements in the National Health Service (NHS). Whether statutory reporting actually occurs is relatively unknown.

WHAT THIS STUDY ADDS

  • Statutory reporting of the Learning from Deaths programme by NHS Secondary Care Trusts (NSCTs) within the publicly available Quality Accounts is variable and inconsistent, owing to NSCTs’ difficulties with engagement.

HOW THIS STUDY MIGHT AFFECT RESEARCH, PRACTICE OR POLICY

  • Further research is needed to assess the value of statutory reporting in Quality Accounts as a regulatory mechanism.

Introduction

Statutory public reporting through the mandatory submission of both financial and non-financial information is used in healthcare internationally as a form of regulation, in an attempt to improve accountability and quality of care.1 2 Within the National Health Service (NHS) in England, public reporting to improve quality has been a component of a National Quality Programme for nearly two decades.3 There is mixed evidence as to whether publicly reporting quality and performance information improves outcomes for patients.4

The ‘Quality Account’ (QA) is an annual report, publicly available on NHS Secondary Care Trusts’ (NSCTs’) websites and submitted to the Secretary of State for Health and Social Care by the end of June each year; publication is mandated by law.5 6 QAs provide data about the quality of NSCTs’ services. The QA requires that at least three priorities for improvement are set and reviewed by individual NSCTs; for example, one NSCT chose to focus on improving staff experience. Other QA requirements are statutory.7 All NSCTs (acute, specialist, mental health, community and ambulance services) have the same requirements for inclusion in their QA. The King’s Fund (2011) reviewed QAs’ compliance with statutory requirements shortly after the guidance was introduced and found significant variation in NSCTs’ quality measures and the format of their presentation. They recommended that policy makers support organisations to ensure ‘clarity and consistency in the presentation of quality indicators’.8

Current legislation requires all NSCTs to report annually both quantitative and qualitative information related to patient deaths within their care, as part of a new programme: Learning from Deaths (LfDs).9 10 LfDs arose in response to independent reviews of individual NSCTs, which demonstrated a lack of a systematic approach to the ways in which NSCTs become aware of, investigate and share learning from deaths, and a lack of meaningful change in response to unexpected deaths.11 12 The NHS National Quality Board (a forum of senior clinicians from key national NHS oversight organisations, including the Care Quality Commission (CQC)) published ‘national guidance on LfDs’, describing mortality governance (structures and processes to ensure accountability and transparency), engagement with bereaved families and the requirement for LfDs reporting in QAs.10 The LfDs programme was not piloted, its logic model was not clear and evaluation was not built into the programme.

Lalani and Hogan (2021),13 in their narrative account of the key drivers in the development of the LfDs programme, noted that some NSCTs have treated the LfDs programme as a ‘tick-box’ exercise. This study, therefore, analyses LfDs reports to understand whether NSCT reporting in QAs matches statutory requirements (with the hypothesis that it does not) and to review the quality of reporting. A qualitative analysis seeks to understand any difficulties around reporting. This study does not analyse the ‘learning’, ‘actions’ or ‘assessment of the impact of the actions’ arising from the NSCT reviews; these will be discussed in a separate paper.

Method

This is a mixed-methods study evaluating the quality of statutory reporting for an NHS improvement programme. The analysis was undertaken using a critical realist lens.14 Critical realism is a model of how social systems work: it distinguishes three layers of social reality (the real, the actual and the empirical) and considers how observed or perceived events lead to action (or inaction) and how this is influenced by social structures. Critical realism can be used to explain what influences organisational action.15

We undertook a secondary source document analysis using quantitative counts and descriptive statistics.16 For qualitative analysis, we used thematic analysis (TA) of all LfDs reports in QAs from NSCTs between 2017 and 2020.17–680 Ambulance trusts were excluded as they were not required to report in 2017/2018.681

Reflexive TA, as described by Braun and Clarke (2021), was used for its flexible yet structured approach to finding ‘shared meaning’ in the data while enabling researchers to consider their own impact on data interpretation.682 In this study, reflexive TA enables the evolution and contextualisation of the causal mechanisms affecting the ability of NSCTs to report LfDs. The authors purposefully used quantitative and qualitative approaches, despite the tensions arising, to provide a fuller explanation of NSCTs’ empirical reporting, to engage with front-line staff and policy makers (who embrace quantitative research) and to reveal underlying deeper causal mechanisms negatively affecting LfDs reporting.

We analysed LfDs reporting as set out in the 2017 amendment to the NHS 2010 QA regulations: this involved reviewing the compliance of reports against regulations 27.1–27.6 (table 1).9 Where NSCTs did not fully report, we sought to understand why this might have been the case from comments within the QA itself. In addition, we reviewed LfDs reports for basic errors in quality (such as grammatical and typographical mistakes). Only data found within the NSCT’s QA were included in the analysis. Quantitative analysis was undertaken and reported using descriptive statistics. The qualitative methodology was a process of data familiarisation (reading each report twice on separate occasions), systematic data coding, generating initial domains deductively,683 then developing multidimensional themes inductively through active engagement and immersion in the data, looking both at what was present and what was absent. Finally, these themes were refined by sense checking against the original reports and discussion with the other authors.

Table 1

NHS quality accounts LfDs regulations9
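For illustration, the sketch below shows how a compliance tally of this kind might be computed. It is a minimal Python example under stated assumptions: the trust names, element keys and records are hypothetical and do not represent the authors’ actual workbook or scoring sheet.

```python
# Illustrative only: a minimal sketch of how compliance with the six statutory
# LfDs elements (regulations 27.1-27.6) might be tallied per reporting year.
# The records and element keys below are hypothetical, not the authors' data.

ELEMENTS = ["27.1", "27.2", "27.3", "27.4", "27.5", "27.6"]

# One record per NSCT per year; True means the element was reported in the QA.
reports = [
    {"trust": "Trust A", "year": "2017/2018",
     "27.1": True, "27.2": True, "27.3": True, "27.4": True, "27.5": True, "27.6": True},
    {"trust": "Trust B", "year": "2017/2018",
     "27.1": True, "27.2": False, "27.3": True, "27.4": True, "27.5": False, "27.6": True},
]

def summarise(records, year):
    """Count trusts reporting all six elements, and trusts reporting none, for one year."""
    in_year = [r for r in records if r["year"] == year]
    all_six = sum(all(r[e] for e in ELEMENTS) for r in in_year)
    none_reported = sum(not any(r[e] for e in ELEMENTS) for r in in_year)
    return len(in_year), all_six, none_reported

total, all_six, none_reported = summarise(reports, "2017/2018")
print(f"2017/2018: {all_six}/{total} ({all_six / total:.0%}) reported all six elements; "
      f"{none_reported} reported none")
```

The descriptive statistics reported in the Results (counts and percentages of NSCTs reporting all six elements, or none) follow directly from tallies of this form.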

Every LfDs report was reviewed by the primary reviewer (ZB) twice, on separate occasions, to ensure full data capture. Ten per cent of reports from 2018/2019 and 2019/2020 were identified by random number generation and reviewed independently by a second reviewer (ZH) to ensure accuracy and reliability. In the case of disagreement, ZB re-reviewed the LfDs report for clarification. Reflexive TA was predominantly undertaken by ZB, with input from the other authors at the final stage.
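As an illustration of the second-review sampling step, a minimal Python sketch is shown below; the report identifiers and the fixed seed are hypothetical and are included only to make the example self-contained and reproducible.

```python
# Illustrative only: selecting a 10% random sample of LfDs reports for
# independent second review. Report identifiers and the seed are hypothetical.
import math
import random

report_ids = [f"NSCT-{i:03d} 2019/2020 LfDs report" for i in range(1, 214)]  # 213 reports

random.seed(2020)                                # fixed seed so the example is reproducible
sample_size = math.ceil(0.10 * len(report_ids))  # 10% of reports, rounded up
second_review_sample = random.sample(report_ids, sample_size)

print(f"{sample_size} of {len(report_ids)} reports selected for second review")
```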

Data were captured in Microsoft Excel (V.16.15). This study has been reported using the Standards for Reporting Qualitative Research.684

Reflexivity

Reflexivity was undertaken alongside the research methodology, as described by Trainor and Bundon (2021).685 For additional information, see online supplemental file 2.

Supplemental material

Patient and public involvement

This study forms part of a larger programme of work overseen by a public and relatives steering group, to improve relevance from the perspective of those affected by deaths in healthcare and to reduce healthcare researcher bias. The steering group have been involved in the planning, design, development of conclusions and paper authorship. The reporting of patient and public involvement has been undertaken using the Guidance for Reporting Involvement of Patients and the Public 2—short form.686

Results

Quantitative analysis

The number of NSCTs is reducing year on year due to mergers: there were 222 NSCTs in 2017/2018, 217 in 2018/2019 and 213 in 2019/2020.

NSCTs who reported all six statutory elements (table 1) of the LfDs reporting framework:

  • 2017/2018: 98 out of 222 (44%)

  • 2018/2019: 109 out of 217 (50%)

  • 2019/2020: 75 out of 213 (35%)

NSCT reporting data for 2017–2020 against the six statutory elements can be found in table 2.17–680

Table 2

NHS Secondary Care Trusts (NSCTs) LfDs statutory reporting 2017–2020 17–680

NSCTs who did not report any parts of the LfDs regulatory requirements:

  • 2017/2018: Two NSCTs (0.9%)

  • 2018/2019: Three NSCTs (1.4%)

  • 2019/2020: Eleven NSCTs (5.2%) (nine published a QA but no LfDs report; two published no QA)

Some NSCTs did not provide any figure (not even zero) for deaths judged to be more likely than not to have been due to problems in care (excluding NSCTs who did not provide an LfDs report at all):

  • 2017/2018: 20 NSCTs did not report

  • 2018/2019: 18 NSCTs did not report

  • 2019/2020: 19 NSCTs did not report

Reasons given for not reporting included:

  • ‘Data collection challenges’.

  • ‘Unable to provide a reliable figure’.

  • ‘We do not carry out investigations with a view to determining whether the death was wholly or partly due to problems in the care provided’.

  • ‘Currently no research base on this for mental health services and no consistent accepted basis for calculating this data’.

Reasons for not reporting are discussed further within the qualitative reflexive TA.

Qualitative reflexive TA

‘This programme isn’t for us’

It was clear from the reports, particularly in 2017/2018, that some NSCTs did not believe the guidance was written for or applied to them. The similarity of some of the wording across LfDs reports could indicate collaboration between NSCTs about how to approach elements of the regulation deemed more controversial, particularly reporting the number of deaths more likely than not due to problems in care. This was especially the case for Mental Health and Community NSCTs: ‘there was no nationally agreed definition for mental health services or community health services with regard to what constitutes a death from problems in care and therefore this data was not reported’.

A small number of NSCTs reported defensively, for example: ‘Data presented within this section is for learning within the Trust and is not comparable with any other trusts (Acute, Community and Mental Health) published data, it should not be used to provide organisational benchmarking or presented as comparators in any onwards reports’.

Another less common occurrence was wording demonstrating annoyance: ‘NHS England did not make any central training resource available to roll out training on how the mortality review tool should be implemented by mental health trusts across the country and to spread learning from it both locally and nationally’.

One NSCT did report in 2017/2018 and 2018/2019 but has not subsequently provided an LfDs report in 2019/2020 (or 2020/2021), despite commenting in their QA that they have a ‘Detailed LfDs programme in place’.

Some NSCTs did report a death or deaths due to problems in care but did not report any learning or actions as a result; none explained why this was the case. For example, one NSCT reported 11 deaths more likely than not due to problems in care; however, in the lessons learnt and actions sections they stated ‘N/A’.

The statutory requirement to report the number of deaths due to problems in care appears to be one of the most difficult for NSCTs to meet. As can be seen from the quantitative data, many NSCTs reported zero deaths or did not report the number of deaths due to problems in care. One NSCT described being ‘unable to provide a reliable figure for the number of deaths in the reporting period which were judged more likely than not to have been due to problems in the care’ because the methodology they were using does ‘not allow the calculation of whether a death has a greater than 50% probability of being avoidable’, adding that this methodology, provided by the Royal College of Physicians, does not ‘endorse the comparison of data’ between NSCTs.

‘A partial truth’

A very small number of NSCTs made approximations in their LfDs reports related to deaths. One NSCT described that the ‘mortality review process identified ≤5 deaths that were judged to be more likely than not to have been due to problems in the care provided to the patient.’ Another NSCT did not give an exact number of total deaths, instead stating that ‘around 3000 inpatients die in our hospitals every year’.

One NSCT described no deaths due to problems in care but also noted receipt of a Prevention of Future Death (PFD) report from the coroner within the same QA; the coroner’s PFD report described problems in care where actions could be taken.687

Another NSCT reporting zero deaths due to problems in care in their 2018/2019 LfDs report also reported within the same QA that 23 patients came to severe harm or death due to patient safety incidents in 2018/2019. While it is possible that all of these incidents caused severe harm (and not death), this is not stated.

‘Not speaking the same language’

Reporting is heterogeneous. NSCTs have reviewed and investigated with varying thoroughness and differing methodologies. Some NSCTs have not reported data for the same time periods (eg, reporting annual data from January to December instead of April to April). Some NSCTs reported incomplete data for the total number of deaths and/or reviews/investigations (eg, only reporting for 9 months of the year). NSCTs have assessed which deaths are in scope for inclusion variably, with differing inclusion and exclusion of emergency department patients, children, stillbirths and outpatients.

Reviewers and investigators are also not standardised and may not always be suitably qualified to undertake a review: ‘Investigation of patient’s complaint of pain in his neck should have been investigated earlier, although it is beyond the experience of the reviewer as to how common this complaint is after tracheostomy’.

Other elements of reporting are also difficult to compare; for example, what one NSCT considers to be an assessment of impact is not what another considers to be the case. Some NSCTs have focused heavily on the process around LfDs, for example, which patients are in scope for review, but provide very little detail about the incident itself.

‘Errors in written communication’

Errors in written communication recur frequently across a large number of NSCTs’ LfDs reports. Many of these mistakes in isolation could be considered minor, but together they account for a significant number of mistakes in a public-facing document about people who have died.

Some of these errors could be deemed careless typographical and/or mathematical errors, for example, misplacing a decimal point and thus completely changing the scale of the data. Other relatively common errors include not providing a heading or title for the LfDs report, or for tables or graphs; mistakenly writing the wrong financial year within the report; and duplicating information. Another occasional problem is contradictory information within the same report.

Occasionally, sentences within the LfDs reports are difficult to interpret or do not make sense due to grammatical errors. One NSCT’s LfDs report states: ‘In 2018–2019 specialty mortality and morbidity meetings the quality of case discussions has been improved through the additional collective judgement of the overall quality of care using the NCEPOD grading tool’.

Discussion

This study demonstrates a significant lack of LfDs reporting in QAs between 2017 and 2020 and a signal that many NSCTs did not fully understand the statutory requirements or did not think they applied to them (‘This programme isn’t for us’). The overall reporting of all six statutory elements of the LfDs reporting framework (table 1) fell from 44% in 2017/2018 to 35% by 2019/2020. This reduction is in contrast to expectations 3 years after the introduction of the programme.688 It is possible that the COVID-19 pandemic, affecting NSCTs by March 2020, could account for this, but this seems unlikely to be the whole story since the QA reporting year ended in April 2020. When we previously reported on the quality of 2017/2018 LfDs reporting, we noted that reporting variation may be due to differences in interpretation of the guidance and statutory requirements; however, 3 years should be sufficient time to get to grips with reporting requirements.683 The incentive to report appears clear: to demonstrate quality of care and safety for patients. The penalty for not reporting is less clear; there is no direct financial penalty for an NSCT not reporting LfDs, and the only sanction facing NSCTs is that they may be notified to take action following CQC inspections.689 Furthermore, the evidence that external inspections actually improve NSCT performance is unclear.690

There remains wide variation in the total number of deaths reported by NSCTs. This may in part be accounted for by the different types of NSCT: specialist NSCTs will have fewer deaths, while community NSCTs could have thousands of deaths if outpatients are included. However, the wide variation in the number of case record reviews or investigations undertaken relative to the number of patient deaths in individual NSCTs is less easily explained, and was a concern raised by the CQC in 2016.12 There has been improvement in the standardisation of NSCT case note review methodology, with the majority of NSCTs (particularly acute NSCTs) using the Structured Judgement Review (SJR) methodology by 2020.691 The SJR methodology is a standardised method introduced by the Royal College of Physicians in 2016 to review and describe the quality of care received by patients who have died in NSCTs.691 Mental Health NSCTs were, however, less likely to be using the SJR methodology and overall appeared to be less engaged with the LfDs programme.

Some NSCTs (between 11% and 14%, depending on the year of analysis) did not report any lessons learnt from deaths; some reported zero deaths more likely than not due to problems in care and, therefore, were not strictly required to report lessons learnt. There were also a small number of NSCTs who reported deaths more likely than not due to problems in care but did not go on to report lessons learnt. As one example, an acute NSCT reported 11 deaths due to problems in care (2017/2018) but then reported ‘N/A’ for learning and actions. In its last CQC inspection (2019), this NSCT was rated ‘Good’ overall and ‘Requires improvement’ for safety; LfDs was not mentioned in the CQC report and it is not clear whether the NSCT understood the LfDs statutory reporting requirements.692 As discussed in ‘This programme isn’t for us’, several NSCTs raised concerns about the requirement to report the number of deaths due to problems in care. This concern was also raised by NSCT representatives at NHS England LfDs guidance development days in 2017 (attended by ZB and DB), where problems with the term ‘avoidability’ were raised due to the negative impact on NSCT reputation.

The four themes arising from the qualitative analysis (‘This programme isn’t for us’, ‘A partial truth’, ‘Not speaking the same language’ and ‘Errors in written communication’) demonstrate a degree of conflict or carelessness on the part of NSCTs, which is at odds with NHS constitutional principles.693

Written communication such as that in the QA is used both to give information and to persuade the reader that the NSCT is safe and that quality healthcare is being provided. This communication should be clear, complete, accurate, concise and honest. A lack of attention to detail in written communication could legitimately raise concerns about data accuracy and could make the general public concerned about a lack of attention to detail in clinical care. With this in mind, ‘errors in written communication’ have greater significance than might be initially apparent. The small number of NSCTs highlighted in ‘a partial truth’, with approximations for the number of deaths and clear contradictions in reporting within the QA, raise similar but potentially even greater concerns.

It is not usually apparent who has collected, collated and compiled the data within a QA. However, all QAs are ‘signed by the responsible person for the provider that to the best of that person’s knowledge the information in the document is accurate’. This is usually someone from the NSCT Board, often the Chief Executive and/or the Chair. NSCTs are also required to share their QAs with commissioners, their local Healthwatch and Overview and Scrutiny Committees.694

Lack of consensus on how to measure quality

The measurement of quality through outcome data and the intention to use this to improve health services are commonplace in healthcare internationally. There is little consensus on which measures, and how many indicators, are best used to assess quality.695 Data collection, collation and presentation for QAs are resource intensive, but if deemed a statutory requirement the data should be provided in their entirety by NSCTs, even if viewed as a ‘tick-box’ exercise.696 QA data can be used to improve quality of care, accountability and patient choice.697 However, when those data are not provided and there is no recourse for this, the legitimacy of the system is called into question. It is possible that the burden of regulation has become too great for some NSCTs, particularly where there is little feedback from regulators about how the data are used or whether they are actually useful.698

Published research reviewing or analysing the content of NSCT QAs is very limited. Jones et al (2017) found that NSCTs with high Quality Improvement (QI) maturity had QAs with clear internally driven priorities, in contrast to those with low QI maturity.699

The use of mortality data

The use of data related to hospital mortality to assess performance (and safety) has been a contested issue for many years. The reasons include concerns about the heterogeneity of data (even with risk adjustment) between NSCTs and the accuracy of the data. Lilford and Pronovost (2010)700 strongly argued that the use of mortality rates to assess performance for regulatory purposes was wrong, due to the lack of specificity provided by these data and the unfair penalisation that results. They raised concerns that regulation tied to mortality rates could lead to falsification of reporting or even ‘overly aggressive care’, and that identification (and subsequent investigation) of poorly performing NSCTs could detract from improvements across all NSCTs.700 NSCTs should, however, be aware of how many people died in their care and have robust processes in place for families (and staff) to raise concerns that a death might have been due to problems in care. All people will die and most deaths cannot be prevented. But where a death has occurred due to problems in care, an open and honest investigation is required, actual learning about what the problem was and why it happened must occur, improvement action is needed and processes must exist to evaluate the impact of any interventions. There is a healthcare (and perhaps societal) cultural reluctance to accept death. This reluctance may play a part in why some NSCTs have difficulties engaging with LfDs.701

The impact of public reporting

The evidence that NSCT QI, public reporting or even legislation improves health outcomes or patient safety is sparse.4 702 703 The costs associated with healthcare policy and interventions are sometimes overlooked in favour of the possibility of a quick fix.704 This, in addition to a lack of consideration of the complexity and importance of implementation support, may prevent health policy success.705 Evaluation of countrywide QI programmes such as the LfDs programme is not commonly undertaken. Evaluation should be built into these programmes at the planning stage and should form an essential component of all health policy.706 There is limited understanding of the effectiveness of QI interventions in supporting failing organisations. Underperforming NSCTs have less time and fewer resources available for new QI programmes; therefore, consideration should be given to implementation and what support might be needed.707 708

Next steps

For the LfDs programme, we would recommend robust regulatory reporting oversight in addition to CQC inspections.

Strengths and limitations

This research builds on earlier research looking at LfDs and adds further insights into compliance with national statutory reporting in the NHS. This study uses a large data set of all available NSCT QA LfDs data across three consecutive years (2017–2020). This research uses a mixed-methods approach to provide a broader understanding of LfDs reporting. This study has ensured significant involvement from bereaved relatives in order to reduce researcher bias.

The research reviews only one policy (LfDs) for NHS national statutory reporting, and the findings may not be completely applicable to statutory reporting in general. There is potential researcher bias introduced from the viewpoint of the primary researcher as a frontline healthcare worker, particularly as initial qualitative coding was undertaken solely by this researcher.

Conclusions

When a patient dies in an NSCT due to problems in care, this death must be identified and investigated and, where possible, actions taken to prevent future deaths. Accountability and an understanding of human fallibility must be balanced. The LfDs programme has, to some extent, reduced variation and improved consistency in the way that NSCTs identify, report and investigate deaths. However, 3 years into the LfDs programme, the majority of NSCTs are not reporting as legally required and current evidence that LfDs reporting has improved patient safety remains elusive.

Public statutory reporting, even if done well, may not improve quality of care or patient safety; therefore, further research is needed to assess the value of QAs. Within the LfDs programme, the inability of many NSCTs to ‘tick the box’ of basic statutory requirements limits successful evaluation. The findings from this study bring into question the validity of statutory reporting in QAs as a regulatory process.


Ethics statements

Patient consent for publication

Acknowledgments

We would like to acknowledge the work of the Learning from Deaths: Learning and Action (LfDLaA) Public and Relatives steering group in this research.


Supplementary materials

  • Supplementary Data

    This web only file has been produced by the BMJ Publishing Group from an electronic file supplied by the author(s) and has not been edited for content.

Footnotes

  • Contributors ZB designed and led the study, collected and collated LfDs reports, undertook analysis and interpreted the findings. ZB also wrote the first draft of the paper. DB contributed to planning the study, interpreting the findings and editing the paper. ZH undertook quantitative and qualitative analysis of LfDs reports. SRM contributed to planning the study, interpreting the findings and editing the paper. CV-P contributed to planning the study, interpreting the findings and editing the paper. ZB is the guarantor.

  • Funding Patient and public involvement in this research was supported by the NIHR UCL Biomedical Research Centre (award number: BRC617/PPI/ZB/104990).

  • Competing interests None declared.

  • Patient and public involvement Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.