Article Text


‘Start smart’: using front-line ownership to improve the quality of empiric antibiotic prescribing in a paediatric hospital
  1. Robert Cunney1,
  2. Michelle Kirrane-Scott2,
  3. Aisling Rafferty2,
  4. Patrick Stapleton1,
  5. Ikechukwu Okafor3,
  6. Roisin McNamara3
  1. Microbiology, Temple Street Children's University Hospital, Dublin, Ireland
  2. Pharmacy, Temple Street Children's University Hospital, Dublin, Ireland
  3. Emergency Medicine, Temple Street Children's University Hospital, Dublin, Ireland
  Correspondence to Dr Robert Cunney; robert.cunney@cuh.ie

Abstract

Infection is the most frequent indication for non-scheduled admission to paediatric hospitals, leading to high levels of empiric antibiotic prescribing. Antibiotic prescribing in line with local guidelines improves patient outcomes, reduces adverse drug events and helps to limit the emergence of antimicrobial resistance. We undertook an improvement project at Temple Street Children’s University Hospital targeting documentation of indication and compliance with empiric antibiotic prescribing guidelines among medical admissions via the emergency department (ED). Results of weekly audits of empiric antibiotic prescribing were fed back to prescribers. Front-line ownership techniques were used to empower prescribers to generate ideas for change, such as regular discussion of antibiotic prescribing issues at weekly clinical meetings, an antibiotic ‘spot quiz’, updates to prescribing guidelines, improved access to and promotion of a prescribing app, laminated guideline summary cards, and reminders and guideline summaries at the point of prescribing in the ED. Documentation of indication and guideline compliance increased from a median of 30% in December 2014 to 100% in March 2015, was sustained at 100% until September 2016, and then at 90% until December 2017. The intervention was associated with improvements in non-targeted indicators of prescribing quality, an overall reduction in antimicrobial consumption in the hospital, and a €105 000 reduction in annual antimicrobial acquisition costs. We found that a simple, paper-based data collection system was effective, provided opportunities for point-of-care interaction with prescribers, and facilitated weekly data feedback. We also found that using a pre-existing weekly clinical meeting to foster prescriber ownership of the data, allowing prescribers to identify possible tests of change, and exploiting the competitive nature of doctors led to a rapid and sustained improvement in prescribing quality. Awareness of local prescribing processes and culture is essential to delivering improvements in antimicrobial stewardship.

  • antibiotic management
  • complexity
  • pdsa
  • quality improvement
  • paediatrics

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.


Problem

Infections are the most common indications for non-scheduled admissions to paediatric hospitals.1 2 Although the majority of acute infections in children are viral, empiric antibiotic therapy is frequently prescribed until a bacterial cause can be excluded, particularly in younger infants presenting with sepsis.3 Antibiotic prescribing in line with local prescribing guidelines is associated with improved patient outcomes, including reduced mortality, in addition to reducing selection pressure for antimicrobial resistance (AMR), adverse drug events and financial cost.4 5

Temple Street Children’s University Hospital (TSCUH) is a paediatric hospital providing secondary care for the paediatric population of North Dublin city and county, along with a number of national specialty tertiary referral services.

TSCUH has had guidelines in place for empiric antibiotic prescribing for many years. Self-reporting by consultant paediatricians and junior doctors working at TSCUH suggested that the guidelines were well liked and widely adhered to. However, a 2014 national Point Prevalence Survey (PPS) on antimicrobial use found that the indication for empiric antibiotic therapy among inpatients in TSCUH was documented for only 86% of prescriptions, and that therapy was compliant with local guidelines in only 63% of cases. As these two measures are key components of the national ‘start smart, then focus’ antibiotic prescribing care bundle, we decided to make them the target of an improvement project.6 Commencing in January 2015, our aim was to ensure that, by 1 June 2015, ≥90% of children admitted to the paediatric medical services via the emergency department (ED) at TSCUH and receiving empiric antibiotic therapy had the indication for therapy documented in the medical admission record and antibiotic choices in line with TSCUH empiric prescribing guidelines.

Background

Rational antibiotic prescribing is a key component in addressing the global threat of AMR.7 In 2011, the UK Department of Health developed an evidence-based care bundle for rational empiric prescribing of antibiotics in hospital care (‘start smart, then focus’).8 This approach has been adapted in other countries, including Ireland.6

Antibiotics are frequently prescribed in paediatric hospital care, with 36%–42% of inpatients in Irish paediatric hospitals receiving antibiotics at any given time (A Oza, Health Protection Surveillance Centre, personal communication). Successful antimicrobial stewardship programmes generally employ a combination of preauthorisation for antibiotic prescribing, feedback of prescribing data to individuals or groups of prescribers, educational interventions, and review of prescribing decisions through direct interaction with prescribers (‘academic detailing’).9 10 Recent studies have suggested that data feedback and direct interaction with prescribers appear to have the most lasting impact on prescribing practice.11 12 In particular, addressing the behavioural aspects of prescribing decisions appears to be important in delivering lasting changes in prescribing practice.13 14 Antimicrobial stewardship programmes in paediatric settings have been shown to reduce antimicrobial consumption, drug costs and medication errors, without adverse impacts on patient safety.12 15

Front-line ownership (FLO) is a process derived from principles of complexity science, in which Liberating Structures are used to engage individuals and groups in improvement activities.16 17 Central to the concept of FLO is the recognition that the delivery of improvements in clinical practice requires ownership, and subsequent development of solutions, by those closest to a particular problem.18 This approach has been shown to produce sustained improvements in a number of areas of hospital practice, such as hand hygiene compliance.19

Measurement

Baseline measurement of the quality of antimicrobial prescribing at TSCUH was obtained from the annual national PPS on antimicrobial use in Irish hospitals.20 Results from the October 2014 PPS showed that only 63% of inpatient antibiotic therapy in TSCUH was in line with local prescribing guidelines, and 86% had a documented indication.

To provide data for improvement for our project, we collected data on whether or not the following four indicators of prescribing quality were documented in the admission medical notes:

  1. Indication for antibiotic therapy.

  2. Antibiotic dose(s).

  3. Route of administration.

  4. Planned duration of antibiotic therapy (or documented stop/review date).

We also assessed whether or not the choice of antibiotic therapy was compliant with our local prescribing guidelines. We collected data on the first 10 children identified during antimicrobial stewardship ward rounds who had been admitted via ED to the medical paediatric service and prescribed empiric antibiotic therapy.

In keeping with FLO concepts, we explored what data, and which data feedback methods, were most meaningful to front-line staff. Based on discussions with prescribers, we chose to feed back data on documentation of indication and compliance with guidelines as a single combined measure. If the indication for antibiotic therapy was documented and the choice of therapy was compliant with guidelines, the prescription was given a score of 1. If either element was missing, it was given a score of 0. Results were expressed as a percentage (based on the weekly review of 10 cases).
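To illustrate how this combined measure works, the following is a minimal sketch in Python (not part of the original project, which used paper-based collection); the field names and the example audit data are hypothetical, chosen so that the result matches the 30% baseline described below.

```python
# Minimal illustrative sketch of the combined measure: a case scores 1 only if
# the indication is documented AND the antibiotic choice follows local
# guidelines; the weekly result is the percentage of audited cases scoring 1.

def combined_measure(cases):
    """Percentage of cases with both indication documented and guideline-compliant therapy."""
    scores = [
        1 if c["indication_documented"] and c["guideline_compliant"] else 0
        for c in cases
    ]
    return 100 * sum(scores) / len(cases)

# Hypothetical weekly audit of 10 cases: 3 meet both criteria.
weekly_audit = (
    [{"indication_documented": True, "guideline_compliant": True}] * 3
    + [{"indication_documented": True, "guideline_compliant": False}] * 4
    + [{"indication_documented": False, "guideline_compliant": True}] * 3
)
print(f"{combined_measure(weekly_audit):.0f}%")  # -> 30%
```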

While developing our measurement approach, we also focused on the quality and reliability of our data collection. We initially collected data on Mondays, but found that this produced results that were biased towards particular admitting paediatric teams. We then spread our data collection out over different days, and also varied the order of wards sampled, to ensure an even distribution of medical specialties and patient age groups.

We had initially attempted to identify children for inclusion in the weekly audits via the ED patient information system, but this proved cumbersome and did not include data on all of the included indicators. We, therefore, changed to incorporating paper-based data collection into antimicrobial stewardship ward rounds.

During our initial data collection period (December 2014), and prior to introducing any active interventions, we found that only 30% of children included in our audits had empiric antibiotic therapy that had both a documented indication and compliance with local prescribing guidelines.

In addition to the weekly audits, we monitored overall antibiotic consumption and expenditure using data extracted from the pharmacy information system.

Design

We established a project team, comprising a consultant microbiologist (RC), antimicrobial clinical pharmacist (MK-S, later replaced with AR), ED consultants (IO and RM) and a medical microbiology specialist registrar (PS). We also held meetings with consultant paediatricians, junior doctors, data specialists, and ED and inpatient ward clinical nurse managers. The project team identified primary and secondary drivers for rational antibiotic prescribing and, from these, identified a number of potential interventions to improve prescribing practice. These were based on examples of interventions reported in the literature, discussion with antimicrobial stewardship team members from other hospitals, discussions with individual prescribers and direct observation of prescribing practice (primarily in the ED).

We initially planned to put a range of educational interventions in place on the assumption that inappropriate antibiotic prescribing was largely related to knowledge gaps among prescribers regarding issues such as sepsis management, adverse drug reactions and AMR. However, because we wanted to follow the principles of FLO, we chose to allow prescribers to identify interventions that they felt were most likely to be effective.

Strategy

Plan-Do-Study-Act (PDSA) cycle 1

In January 2015, we commenced weekly feedback of audit data at a pre-existing clinical meeting attended by consultant paediatricians and junior doctors every Monday morning. Our hypothesis was that weekly presentation of a single combined indicator of prescribing quality would engage prescribers and promote awareness of the need to improve the quality of empiric prescribing. We also hypothesised that we could maximise engagement by making the data presentations brief (2–3 min) and incorporating them into an existing meeting. Data were displayed as a percentage on a run chart. We initially used PowerPoint (Microsoft, Redmond, Washington, USA) slides, but found that attendees were more engaged with the data when these were replaced by hand drawing the run chart on a white board. This initial data feedback was associated with an improvement in our audit measure from our baseline median of 30% to 50%. After 4 weeks of brief, weekly data feedback, we sensed that prescribers were taking ownership of the data, based on interactions at the weekly meeting (eg, prescribers moved from asking ‘how is your project going?’ to asking ‘how are we doing?’). At this point we introduced FLO techniques to identify potential interventions.
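For readers unfamiliar with run charts, the sketch below (Python with matplotlib, not the authors' tooling; the project ultimately used a hand-drawn chart) shows the general form of such a weekly feedback chart. The weekly values are hypothetical; only the baseline median of 30% is taken from the text.

```python
# Illustrative run chart of the weekly combined measure against the baseline
# median. Weekly percentages here are hypothetical; only the 30% baseline
# median comes from the project data.
import matplotlib.pyplot as plt

weeks = list(range(1, 9))
percent_compliant = [30, 30, 40, 50, 50, 60, 60, 70]  # hypothetical audits
baseline_median = 30

plt.plot(weeks, percent_compliant, marker="o", label="Weekly audit (%)")
plt.axhline(baseline_median, linestyle="--", label="Baseline median (30%)")
plt.xlabel("Week")
plt.ylabel("Indication documented and guideline compliant (%)")
plt.ylim(0, 100)
plt.legend()
plt.title("Empiric antibiotic prescribing run chart (illustrative)")
plt.show()
```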

PDSA 2

We used Liberating Structures (‘Impromptu Networking’ and ‘1-2-4-All’) to engage prescribers in discussing their responses to audit data, identify areas of agreement in relation to rational antibiotic prescribing (‘Minimum Specifications’), and suggest potential tests of change.19 These engagement sessions were incorporated into the Monday clinical meeting, along with the weekly data feedback, and were limited to a maximum of 10 min per session. One suggestion was to use the inherent competitive nature of clinicians as a source of leverage for change, which we did by comparing audit results from the current versus previous cohort of junior doctors. We also developed a ‘spot quiz’ on empiric antibiotic prescribing, the answers to which could be found in the hospital empiric prescribing guidelines. These interventions were associated with an increase in our weekly audit measure to a median of 60%.

PDSA 3

Prescribers identified ready access to guidance on common antibiotic prescribing indications as a key factor in aiding prescribing decisions, with a roughly even split between those who preferred electronic and those who preferred printed guidance. Based on this, we updated our prescribing app to make the local empiric antibiotic prescribing guidelines more prominent, and provided clear instructions for all prescribers on downloading and using the app. In addition, we produced a laminated card, designed to be attached to staff identification badges, which summarised the most frequent indications for empiric antibiotic therapy. We incorporated suggestions arising from our engagement sessions with prescribers regarding the choice of conditions to include, the level of detail required and other design aspects (such as use of colour) in the laminated cards. We also shared draft designs of the laminated card with prescribers, to ensure they had a clear sense of ownership of the design process. This intervention was associated with our weekly audit measure reaching 100%.

PDSA 4

From our weekly audit activity, we found that the vast majority of empiric antibiotics for children admitted from the community were prescribed in the ED by junior doctors from the admitting medical team. We interviewed and observed practice among consultants and junior doctors, and determined that most junior doctors chose to complete their admission and prescribing documentation at the same location in the ED. We, therefore, mounted a laminated poster-sized version of the guideline summary card at this location. This was associated with our weekly audit measure being maintained at 100% (figure 1).

Figure 1

Annotated run chart of proportion of empiric antibiotic prescribing with documented indication and compliance with local prescribing guidelines, 1 December 2014 to 6 July 2015. ED, emergency department.

When the weekly audit data first reached 100% compliance with documentation of indication and compliance with guidelines, we marked this by providing sweets (‘Quality Street’, Rowntree Mackintosh) at the following Monday clinical meeting, to celebrate success and reinforce good practice.

Of note, some of the interventions we had planned within the project team were rejected by prescribers during engagement sessions. For example, we had planned to attach reminders regarding rational antimicrobial prescribing to copies of the British National Formulary for Children (BNFc) provided to all clinical areas (a similar intervention had contributed to improvements in hand hygiene compliance in a previous improvement project involving day case admissions). Prescribers fed back, however, that they were more likely to use the hospital prescribing app for common antibiotic indications, that attaching reminders to the BNFc could interfere with prescribing practices for other drugs, that copies of the BNFc may not always be located at the point of prescribing and that the intervention would be difficult to sustain.

We also found that the planned educational interventions were unnecessary. Of note, we did not highlight adverse drug events, AMR or cost containment in any of our engagement sessions with prescribers.

We carried out a further engagement session in July 2015, and interviewed consultant paediatricians and junior doctors to identify which components of the improvement process they felt had the greatest impact on promoting rational prescribing. Participants consistently identified weekly feedback of audit results, allowing prescribers to suggest interventions, and provision of the laminated reminder card as the most important components.

Results

Our initial aim had been to achieve ≥90% compliance with documentation of indication and prescribing guidelines by June 2015. We reached 100% compliance by March 2015, and this was maintained for a further 18 months. We noted a decrease in compliance in late 2016, which coincided with a temporary vacancy in our antimicrobial pharmacist post, but this subsequently improved and was maintained at a median of 90% to the end of 2017 (figure 2).

Figure 2

Run chart of monthly proportion of empiric antibiotic prescribing with documented indication and compliance with local prescribing guidelines, December 2014 to December 2017.

We subsequently reduced the frequency of audits and feedback from weekly to 3-weekly (from 13 April 2015) and then to monthly (from 6 July 2015 onwards). Compliance was maintained at 100% over the following 18 months, including through three further changeovers of junior doctors (July 2015, January 2016 and July 2016).

We found that there was an overall reduction in antibiotic consumption following the interventions (72.5 defined daily doses (DDD) per 100 bed-days used in 2014, compared with 62.1 in 2015). This was associated with a sustained decrease in antibiotic expenditure for the hospital (compared with 2014, this equated to a €105 000 decrease in expenditure for 2015). Although we did not target the overall level of antibiotic consumption, and other antimicrobial stewardship interventions were introduced during this time, we believe that our improvement project contributed to this reduction through increased awareness of good prescribing practice among prescribers. We also documented that this sustained decrease in overall antimicrobial consumption occurred while the national median consumption for acute hospitals was increasing (online supplementary material).
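As a brief aside on the consumption metric: DDD per 100 bed-days is the total number of defined daily doses dispensed, divided by the number of occupied bed-days, multiplied by 100. The sketch below (Python) shows the arithmetic; the annual totals used are hypothetical and chosen only to reproduce the published rates of 72.5 and 62.1.

```python
# Illustrative arithmetic for antibiotic consumption expressed as DDD per
# 100 bed-days. The total_ddd and bed_days values are hypothetical; only the
# resulting rates (72.5 for 2014, 62.1 for 2015) come from the article.

def ddd_per_100_bed_days(total_ddd: float, bed_days: float) -> float:
    return 100 * total_ddd / bed_days

print(round(ddd_per_100_bed_days(29000, 40000), 1))  # 72.5 (2014, hypothetical totals)
print(round(ddd_per_100_bed_days(24840, 40000), 1))  # 62.1 (2015, hypothetical totals)
```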

Supplemental material

We subsequently participated in two further national point prevalence surveys, which also demonstrated an improvement in the proportion of antibiotic prescribing courses that had a documented indication and were in line with local prescribing guidelines. As we had found in our weekly audits, these surveys also demonstrated an improvement in prescribing quality indicators (such as documentation of planned duration or stop date) that were not included in data feedback or targeted interventions (figure 3).

Figure 3

Selected data relating to antibiotic prescribing at TSCUH from national point prevalence surveys in October 2014, 2015 and 2016. TSCUH, Temple Street Children’s University Hospital.

Lessons and limitations

We believe that our focus on using FLO was a major factor in reaching, and surpassing, our initial project aim. Allowing prescribers time to understand and accept ownership of the weekly audit data meant that we were more likely to get positive engagement in the process when it came to seeking ideas for change. Goldmann describes seven rules for engaging clinicians in quality improvement.21 Applying these rules probably further contributed to our success, as follows:

  1. Emphasising improvement, rather than assurance: we avoided any discussions around quality assurance (eg, compliance with national standards), but rather focused on local data and interventions.

  2. Avoiding ‘mystical’ language: we avoided using any technical language associated with quality improvement methods (eg, model for improvement, PDSA, Lean, etc), which might have alienated prescribers unfamiliar with these methods or terminology.

  3. Relating improvement work to what matters to clinicians: we focused on using the sense of professional pride, along with the competitive nature of clinicians, as levers in our improvement process. Incorporating data feedback and engagement sessions into a clinical case discussion meeting also helped to set the improvement project in the context of direct patient care.

  4. Accommodating clinicians’ workload and schedules: we incorporated our data feedback and engagement sessions into a pre-existing weekly meeting attended by paediatricians and junior doctors.

  5. Being upfront about the fiscal agenda: we did not emphasise the financial benefits of the improvement project during the initial engagement sessions, though we did include feedback at subsequent sessions on how cost savings from the project helped to support appointment of an additional clinical pharmacist.

  6. Providing relevant data: we tested a variety of metrics for feedback and chose a combined measure that resonated with and was well understood by prescribers. The practice of hand drawing the run chart each week also appeared to help engage prescribers, by making the feedback process relatively informal and introducing a ‘performance’ element.

  7. Emphasising the academic case for quality improvement: while we did not emphasise the academic potential of the project as part of the initial engagement sessions, we did feed back to prescribers when the project was presented at national or international meetings, and when it was selected as a finalist for the 2016 Health Service Executive (HSE) Excellence Awards. Results from the project were also presented at hospital research and quality improvement seminars.

A core concept in FLO is that of ‘Wise Crowds’, whereby a sufficiently large and diverse group is likely to have a more detailed and nuanced knowledge of how a healthcare process or system works.17 Thus, with appropriate facilitation, such groups are often better placed to identify problems and potential solutions than a smaller group of subject matter experts. We found this to be the case in our project, as prescribers were able to identify key drivers for improvement that we had not considered in our initial project planning (eg, leveraging the competitive nature of clinicians), along with potential interventions to support improvement. By not presenting the project as a ‘fait accompli’, this approach also helped to foster ownership of the project among prescribers. This approach has been shown to be more effective in delivering sustained improvement than seeking ‘buy-in’ for a predetermined set of interventions.19 Similarly, a recent study found that allowing physicians to choose local antimicrobial stewardship interventions was associated with a sustained improvement in the quality of antimicrobial prescribing.22

A 2017 Cochrane review on interventions to improve antimicrobial prescribing practices for hospital inpatients found that ‘interventions that included feedback were more effective than those that did not. However, there were too few studies with goal setting or action planning to assess their effect in addition to feedback.’23 A 2012 Cochrane review on the impact of audit and feedback found ‘empirical evidence from non-health literature to suggest that goal setting can increase the effectiveness of feedback, especially if specific and measurable goals are used’.24 Setting a clear goal, in the form of a SMART aim, and providing regular feedback to prescribers were key components of our project. We also incorporated action planning, as we continued to engage with front-line staff to identify additional interventions that could be made to consolidate and sustain improvements once the initial goal had been met.

Toma et al have recently produced a framework for balanced accounting of the impact of antimicrobial stewardship interventions.25 This framework includes the consideration of unexpected effects (both positive and negative) after the intervention has been implemented. Using FLO allowed us to achieve a balanced account by facilitating iterative modification of the initially planned interventions over time. This led to several pleasant surprises, such as the value of leveraging the competitive nature of clinicians, the impact of hand-drawing run charts and the finding that the planned educational interventions were unnecessary. This is in keeping with the derivation of FLO from complexity science, which includes the principle of ‘emergence’. Emergent properties arise from the interaction of individual elements in a complex adaptive system, are greater than the sum of the parts and are difficult to predict by studying the individual elements in isolation.16 Thus, unexpected effects are largely unpredictable at the start of an intervention, but should still be expected.

We had expected to see a reduction in compliance with our audit measures following the 6-monthly changeover of junior doctors, and had planned additional engagement sessions around these times to reinforce the improvement process. However, we were pleasantly surprised to see that compliance remained at 100% in our ongoing audits, despite the influx of a large number of new staff. Discussions with junior doctors who were involved in the initial improvement process, and with junior doctors who started in the hospital in July 2015, appeared to confirm that we had witnessed a change in the culture around empiric antibiotic prescribing. Junior doctors who were involved in the initial phase of the improvement expressed a sense of pride in the results and ownership of the process, and confirmed that the prescribing supports we had put in place fitted well into their practice and were easy to use. Junior doctors starting in the hospital in July 2015 described being informed by their peers that following the empiric prescribing guidelines and using the prescribing tools, such as the laminated guideline summary card, was considered the norm for the hospital. Rogers described the concept of a ‘tipping point’, whereby once 10%–20% of a given population has adopted a new product or practice, its spread and adoption become self-sustaining.26 We believe that the time put into fostering front-line ownership of empiric antibiotic prescribing paid off by ensuring we reached this tipping point relatively quickly once the interventions were introduced.

A key factor in the success of this improvement process was being able to incorporate our data feedback and engagement sessions into a pre-existing weekly medical meeting. There was already a strong culture of open discussion and learning from clinical scenarios at this meeting, which facilitated engaging prescribers in addressing empiric antibiotic prescribing. The generalisability of our improvement process, however, may be limited if such an open forum does not exist for other groups of prescribers.

One of the challenges we faced during the initial phase of the project was in obtaining data for improvement. We discovered that the ED electronic record system was designed to focus on presenting complaints, triage and flow within the ED, and that extracting data on admission diagnoses and empiric therapy was cumbersome. We chose instead to incorporate data collection into daily ward rounds. This had the advantage of providing qualitative feedback on prescribing practice through discussions with nursing and medical staff, prompted by requesting data on which children on the ward were receiving antibiotics. A disadvantage of this approach, however, was that we were not able to collect data on children admitted with an infectious diagnosis who were not commenced on antibiotic therapy (data that may have been available via the ED electronic patient record system).

We encountered confounding factors in our data, particularly when there was an over-representation of children admitted under an individual paediatric team or within a particular age group (eg, neonates). This was addressed by spreading our data collection out across different days of the week, and by ensuring all wards were sampled. Including a higher number of children in each weekly audit would have also helped to address this issue. However, because we excluded surgical admissions, admissions under subspecialties, and children who were on antibiotic therapy directed by the microbiology/infectious diseases service, we found that it was difficult to consistently include more than ten children in each weekly audit. While this meant we had a small sample size for our audit data, it did allow us to feed data back on a weekly basis. Having weekly data, despite its limitations, ensured the results were close to real time for prescribers, and probably helped to foster ownership. We were also able to demonstrate an overall reduction in antibiotic consumption and expenditure that was temporally related to our interventions, and subsequent participation in national antibiotic point prevalence surveys corroborated the improvements in documentation of indication and compliance with empiric guidelines that we had seen in our weekly audit data.

We believe that a number of ‘soft’ skills contributed to the success of this project, including communication techniques and the accessibility and visibility of the antimicrobial stewardship team. This probably explains the dip in compliance with our quality indicators seen in late 2016, following a period when the hospital was without an antimicrobial pharmacist. While we demonstrated sustainability of our improvement, maintaining this over the longer term does require having core team members in place and actively engaged in antibiotic stewardship activities.

Finally, we found that the high profile of the improvement project and sense of ownership among prescribers helped to foster a wider interest in quality improvement among clinical staff. We found that demonstrating a successful quality improvement project acted as a ‘proof of concept’ that quality improvement methodologies work. While we cannot claim that our project was the sole driver for this, we have seen a steady increase in the number of successful quality improvement projects being undertaken by clinical staff from across the hospital.

Conclusion

We identified that the quality of empiric antibiotic prescribing in our hospital was suboptimal, and successfully delivered a sustained improvement that also had positive impacts outside of the quality indicators that were the focus of our improvement process. We have continued to carry out monthly audits of empiric prescribing, and feed these data back at the Monday morning clinical meeting.

We have taken steps to share the learning from our project in other hospitals. This has included presentations to professional societies, individual hospital teams, and at national and international quality and patient safety conferences, along with interactive sessions with hospitals as part of an HSE ‘roadshow’ highlighting improvement projects. The project also formed the basis of a breakout collaborative on hospital antibiotic stewardship, involving teams from 10 acute hospitals around the country. Participating hospital teams applied FLO techniques and the model for improvement, based on the learning from our project, with 8/10 demonstrating improvement in their targeted areas of antibiotic prescribing over a 1-year period. Combining FLO with other quality improvement methods can produce sustained improvement in antibiotic prescribing.

Acknowledgments

The authors would like to acknowledge the support and participation by consultant paediatricians (in particular, Alf Nicholson and Louise Kyne), junior doctors and clinical nurse managers (in particular, Brigid Weir CNM3) at Temple Street CUH. This work was carried out, in part, as a component of the Scottish Quality and Safety Fellowship (SQSF, Cohort 7), and RC would like to acknowledge the support of the SQSF team, in particular Simon Watson (SQSF Clinical Lead) and John Fitzsimons (fellowship mentor).

References


Footnotes

  • Contributors RC designed and coordinated the project, carried out data collection and analysis, and drafted the manuscript. MK-S codesigned the project plan and interventions, carried out data collection and analysis, and contributed to drafting the manuscript. AR carried out data collection and analysis, and contributed to drafting the manuscript. PS, IO and RM contributed to the initial project design, codesigned project interventions and contributed to drafting the manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient consent for publication Not required.

  • Ethics approval The project was approved as a quality improvement project by the hospital’s Clinical Audit Review Committee.

  • Provenance and peer review Not commissioned; externally peer reviewed.