Precommitting to choose wisely about low-value services: a stepped wedge cluster randomised trial
  1. Jeffrey Todd Kullgren1,2,3,4,
  2. Erin Krupka5,
  3. Abigail Schachter6,
  4. Ariel Linden7,
  5. Jacquelyn Miller4,
  6. Yubraj Acharya6,
  7. James Alford8,
  8. Richard Duffy9,
  9. Julia Adler-Milstein3,5,6
  1. Center for Clinical Management Research, VA Ann Arbor Healthcare System, Ann Arbor, Michigan, USA
  2. Department of Internal Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
  3. Institute for Healthcare Policy and Innovation, University of Michigan, Ann Arbor, Michigan, USA
  4. Center for Bioethics and Social Sciences in Medicine, University of Michigan Medical School, Ann Arbor, Michigan, USA
  5. School of Information, University of Michigan, Ann Arbor, Michigan, USA
  6. School of Public Health, University of Michigan, Ann Arbor, Michigan, USA
  7. Linden Consulting Group, Ann Arbor, Michigan, USA
  8. Division of Family Medicine, IHA, Ann Arbor, Michigan, USA
  9. Division of Quality and Performance Improvement, IHA, Ann Arbor, Michigan, USA

  Correspondence to Dr Jeffrey Todd Kullgren, VA Ann Arbor Healthcare System and University of Michigan, PO Box 130170, Ann Arbor, Michigan 48113, USA; jkullgre@med.umich.edu

Abstract

Background Little is known about how to discourage clinicians from ordering low-value services. Our objective was to test whether clinicians committing their future selves (ie, precommitting) to follow Choosing Wisely recommendations with decision supports could decrease potentially low-value orders.

Methods We conducted a 12-month stepped wedge cluster randomised trial among 45 primary care physicians and advanced practice providers in six adult primary care clinics of a US community group practice. Clinicians were invited to precommit to Choosing Wisely recommendations against imaging for uncomplicated low back pain, imaging for uncomplicated headaches and unnecessary antibiotics for acute sinusitis. Clinicians who precommitted received 1–6 months of point-of-care precommitment reminders as well as patient education handouts and weekly emails with resources to support communication about low-value services. The primary outcome was the difference between control and intervention period percentages of visits with potentially low-value orders. Secondary outcomes were differences between control and intervention period percentages of visits with possible alternate orders, and differences between control and 3-month postintervention follow-up period percentages of visits with potentially low-value orders.

Results The intervention was not associated with a change in the percentage of visits with potentially low-value orders overall, for headaches or for acute sinusitis, but was associated with a 1.7% overall increase in alternate orders (p=0.01). For low back pain, the intervention was associated with a 1.2% decrease in the percentage of visits with potentially low-value orders (p=0.001) and a 1.9% increase in the percentage of visits with alternate orders (p=0.007). No changes were sustained in follow-up.

Conclusion Clinician precommitment to follow Choosing Wisely recommendations was associated with a small, unsustained decrease in potentially low-value orders for only one of three targeted conditions and may have increased alternate orders.

Trial registration number NCT02247050; Pre-results.

  • primary care
  • decision making
  • ambulatory care
  • decision support, clinical
  • health services research


Introduction

Low-value health services constitute care that in most cases does not improve patient outcomes and can lead to unnecessary harms.1 In the USA and many other countries, delivery of low-value services is common2 and costly.3 While the Choosing Wisely campaign has drawn substantial attention to evidence about what constitutes low-value care,4 awareness of evidence alone is rarely enough to change many clinical decisions5 6 without complementary strategies that target factors that underlie these decisions.7 However, little is known about how to effectively target factors that drive ordering of low-value services.8 9

Two factors that may encourage clinicians to order low-value services are the context in which clinicians typically make these decisions and cognitive biases that can arise in such contexts. First, even when clinicians are aware that certain services do not improve population-level outcomes, decisions to order these services for individual patients are often made quickly during brief clinic visits,10 in which both multitasking11 and patient requests for services12 13 are common and can over-ride intentions to avoid ordering low-value services. Second, making choices in such busy environments often requires ‘fast’ thinking that relies more on heuristics than on ‘slow’ thinking about evidence.14–17 These contexts and cognitive biases can lead to divergence from evidence-based recommendations to avoid ordering low-value services.

A promising solution to this problem that leverages insights from behavioural economics could be to encourage clinicians to begin making decisions about low-value services before patient encounters, when their thinking is more likely to be ‘slow’ and deliberative.14–16 While engaged in this slower thinking, clinicians could be asked to commit to following specific Choosing Wisely recommendations18 19 against ordering low-value services in future clinical encounters. Behavioural economists term this strategy ‘precommitment’, or the act of committing one’s future self to a course of action.18 20–22 Precommitment has been shown to improve behaviours that can be influenced by choice environments and cognitive biases, such as savings,23 academic performance,18 tobacco use24 25 and weight loss.26 27 Such precommitment interventions tie consequences to a failure to follow through on the committed course of action.21 For clinicians who precommit to avoid ordering low-value services, these consequences include abdication of their professional responsibility to steward resources,28 which could be highlighted through point-of-care supports when ‘fast’ thinking14 is required and cognitive biases could lead to divergence from their precommitment. Importantly, this approach could be readily integrated into routine care in a range of settings and would preserve clinicians’ autonomy around ordering decisions, which can be integral to sustained intervention acceptance.29 Our objective was to test whether this behavioural economic strategy of inviting clinicians to precommit to avoid ordering low-value health services, and providing supports to promote adherence to that precommitment, could decrease orders for potentially low-value services for three common conditions in adult primary care clinics.

Methods

Design overview

We conducted a stepped wedge cluster randomised controlled trial between 1 July 2014 and 30 June 2015 in six primary care clinics of Integrated Healthcare Associates (IHA), a multispecialty group practice in southeast Michigan in the USA. In this study design, the intervention started in a new clinic each month in a randomly assigned order. We chose this design because of its feasibility and because other potential designs would have had critical limitations.30–32 Specifically, randomisation of individual physicians or patients would have been vulnerable to cross-contamination. A parallel-design cluster randomised trial would have been infeasible because resource constraints necessitated starting the intervention in one clinic at a time, and could have been unacceptable to research partners because half of the clinics would not have received the intervention.

The intervention consisted of inviting clinicians to precommit to follow Choosing Wisely recommendations against imaging for uncomplicated low back pain,33 imaging for uncomplicated headaches34 and unnecessary antibiotics for acute sinusitis.35 Clinicians who precommitted received 1–6 months of point-of-care reminders of their precommitment attached to a patient education handout, as well as weekly emails with links to resources to improve communication with patients about low-value services. We used electronic health record (EHR) data to measure the effects of the intervention, and conducted a clinician survey and qualitative analysis of clinician interviews.

Setting and participants

Eligible participants were primary care clinicians (physicians and advanced practice providers) of the six study clinics (figure 1). These six clinics (three internal medicine and three family medicine) were chosen because they were among the largest of the 11 IHA primary care clinics and because at the start of the study the 53 clinicians in these six clinics did not practise at other IHA clinics. Within each clinic, clinicians were eligible to participate if, at the time of recruitment in their clinic, they anticipated practising in that clinic for the duration of the study.

Figure 1

CONSORT flow diagram.

Randomisation and intervention

All six clinics started the study in a control period (figure 2), at the beginning of which all IHA primary care clinicians were informed by email of Choosing Wisely recommendations against imaging for uncomplicated low back pain,33 imaging for uncomplicated headaches34 and unnecessary antibiotics for acute sinusitis.35 This served to increase the likelihood that clinicians were aware of the recommendations throughout the control and intervention periods, rather than just at the time of study recruitment. These three recommendations were chosen because the targeted conditions are common in primary care practice and represent situations in which clinicians’ intentions to avoid ordering low-value services can potentially be swayed by patient requests for services.12 13

Figure 2

Study design. *Data for the months of the recruitment and precommitment processes (ie, the ‘transition months’) in each clinic were excluded from analyses. The vertical line in the boxes for these months indicates the approximate time of the recruitment and precommitment processes in each respective clinic.

After 2 months of the control period, recruitment and the intervention started in a new clinic each month in a random order determined using the user-written RALLOC module36 in Stata V.14. Clinician recruitment consisted of a 20 min in-person presentation by the principal investigator, followed by informed consent. Clinicians who provided informed consent were given the opportunity to precommit to follow the three Choosing Wisely recommendations in future patient encounters by signing a brief document (see online supplementary appendix figure 1). Because the recruitment and precommitment processes could influence practice patterns, the month of the recruitment and precommitment processes for each clinic was considered a transition period (figure 2) and excluded from analyses.32
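To illustrate the resulting schedule, below is a minimal Python sketch of the stepped wedge timeline implied by figure 2 (the authors generated the random order with the user-written RALLOC module in Stata V.14; the clinic labels, seed and month numbering here are illustrative assumptions, not the study’s actual allocation):

```python
import random

# Hypothetical clinic labels; the real allocation used Stata's user-written RALLOC module.
clinics = ["clinic_1", "clinic_2", "clinic_3", "clinic_4", "clinic_5", "clinic_6"]

random.seed(2014)          # arbitrary seed, for reproducibility of this sketch only
order = clinics.copy()
random.shuffle(order)      # random order in which clinics cross over

# Months 1-2: all clinics in control. From month 3 onward, one clinic per month enters its
# transition (recruitment/precommitment) month, which is excluded from analyses. The
# intervention then runs until month 9, and months 10-12 are the follow-up period.
schedule = {}
for step, clinic in enumerate(order):
    transition = 3 + step                                   # months 3-8
    schedule[clinic] = {
        "control": list(range(1, transition)),              # 2-7 control months
        "transition": [transition],                         # excluded from analyses
        "intervention": list(range(transition + 1, 10)),    # 1-6 intervention months
        "follow_up": [10, 11, 12],                          # 3-month follow-up
    }

for clinic, phases in schedule.items():
    print(clinic, phases)
```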

During the intervention period, medical assistants, whose role in usual care was to bring patients to exam rooms and elicit the reason for their visit, were trained to identify patients presenting with a symptom for which one of the three Choosing Wisely recommendations might apply (ie, patients with low back pain, headaches or nasal congestion). For such patients, on the way into the exam room the medical assistant left for the clinician a Post-it reminder of their precommitment attached to a patient education handout (see online supplementary appendix figures 2–4). Training of medical assistants occurred at the beginning of the intervention period and consisted of a 10 min individual training and provision of a one-page summary of their role (see online supplementary appendix figure 5). We chose a paper-based reminder over embedding the reminder in the clinics’ EHR to ensure clinicians viewed the reminder immediately before the encounter, to avoid the challenges of EHR alert fatigue37 and to ensure that the intervention could in the future be implemented in a range of clinical environments irrespective of their resources and EHR adoption.38 Clinicians who precommitted also received a weekly email with links to Choosing Wisely resources to improve communication with patients about low-value services (see online supplementary appendix figure 6).

Depending on the clinic’s randomly assigned order, the intervention period lasted for 1–6 months. These varying lengths were included because of the stepped wedge study design and because even a relatively short duration of decision support could be sufficient to change clinicians’ practice patterns.39 After the end of the intervention period, each clinic was followed for an additional 3 months to evaluate the durability of any intervention effects.

Outcomes and analyses of ordering data

All study outcomes and analytic methods were prespecified. The primary outcome was the difference between the control and intervention periods in the percentage of visits with an applicable potentially low-value order: a lumbar spine X-ray, CT or MRI order in visits for low back pain33; a head CT or MRI order in visits for headaches34; and an antibiotic order in visits for acute sinusitis.35 The main secondary outcome was the difference in these percentages between the control and follow-up periods to assess whether any intervention effect was sustained in the short term following the intervention.

During intervention pretesting and study implementation, clinicians noted that efforts to reduce low-value services could lead to substitution with referrals to specialists who might order the targeted low-value service, or alternate services of questionable value. Consequently, another secondary outcome was the difference between the control and intervention periods in the percentage of visits with a potential alternate order, which we defined as an order for an opiate,40 or an orthopaedics, neurosurgery, spine clinic, neurology or pain clinic referral in visits for low back pain; an order for an opiate or butalbital,41 or a neurology or pain clinic referral in visits for headaches; and an order for a sinus X-ray,42 sinus CT,43 or ear, nose and throat referral in visits for acute sinusitis.

We identified applicable visits in EHR data using the International Classification of Diseases, Ninth Revision (ICD-9) codes developed by Premera Blue Cross and the Washington State Choosing Wisely Task Force to measure rates of potentially low-value services for low back pain, headaches and acute sinusitis in administrative data.44 45 The lists of ICD-9 codes are shown in online supplementary appendix table 1.
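As a rough sketch of this measurement step, the following Python/pandas code flags applicable visits and potentially low-value orders for one condition (low back pain). The file names, column names and the abbreviated code and order lists are hypothetical placeholders; the study’s full ICD-9 code sets are in online supplementary appendix table 1.

```python
import pandas as pd

# Placeholder subsets; the study's full ICD-9 lists are in supplementary appendix table 1.
LOW_BACK_PAIN_DX = {"724.2", "724.5"}                              # illustrative ICD-9 codes
LOW_VALUE_LBP_ORDERS = {"lumbar_xray", "lumbar_ct", "lumbar_mri"}  # assumed order labels

visits = pd.read_csv("visits.csv")   # one row per visit: visit_id, dx_code, clinic, provider, ...
orders = pd.read_csv("orders.csv")   # one row per order: visit_id, order_type

# Applicable visits: a qualifying diagnosis code for the targeted condition
visits["lbp_visit"] = visits["dx_code"].isin(LOW_BACK_PAIN_DX)

# Visits with at least one potentially low-value order
low_value_visits = set(
    orders.loc[orders["order_type"].isin(LOW_VALUE_LBP_ORDERS), "visit_id"]
)
visits["low_value"] = visits["lbp_visit"] & visits["visit_id"].isin(low_value_visits)

# Percentage of applicable visits with a potentially low-value order
pct = 100 * visits.loc[visits["lbp_visit"], "low_value"].mean()
print(f"{pct:.1f}% of low back pain visits had a potentially low-value order")
```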

For each outcome we estimated differences in percentages using linear mixed models with random effects for providers nested in practices, adjusted for patient age, patient gender, patient Charlson comorbidity score at the visit,46 month and condition. We fitted linear mixed models for the dichotomous outcome variables because generalised linear models did not converge.47 The choice of model terms and random effects was aided by reviewing likelihood ratio tests, Akaike’s information criterion and Schwarz’s Bayesian information criterion among competing models. We used robust SEs clustered at the clinic level. All hypothesis tests used a two-sided α of 0.05 as the threshold for statistical significance. Data were analysed in Stata V.14.
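A minimal sketch of a comparable visit-level model in Python’s statsmodels is shown below for orientation (the authors fitted their models in Stata V.14; the column names low_value, intervention, female and charlson are hypothetical, and this sketch reports model-based rather than cluster-robust SEs):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analytic file: one row per applicable visit
df = pd.read_csv("analytic_visits.csv")

# Linear (probability) mixed model for the dichotomous outcome, with a random intercept
# for clinic and a variance component for providers nested within clinic.
model = smf.mixedlm(
    "low_value ~ intervention + age + female + charlson + C(month) + C(condition)",
    data=df,
    groups="clinic",                               # top-level cluster: clinic
    re_formula="1",                                # random intercept for clinic
    vc_formula={"provider": "0 + C(provider)"},    # providers nested within clinic
)
result = model.fit()
print(result.summary())

# The coefficient on 'intervention' approximates the control-to-intervention difference
# in the probability of a potentially low-value order (x100 for a percentage-point change).
print(result.params["intervention"])
```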

In our power calculation, we used IHA EHR data to estimate that each clinician would have 29 applicable visits per month, that approximately 20% of these visits would have an order for a potentially low-value service, and that the intraclass correlation coefficient would be 0.034. Based on these estimates, a sample size of eight clinicians per clinic, six clinics and a two-sided α of 0.05 would provide 80% power to detect a difference in percentages of at least 3.7% between the control and intervention periods.

Postintervention surveys and interviews of clinicians

During the 3-month follow-up period, we asked study clinicians to complete a web-based survey about their demographic characteristics, patient panel characteristics and perceptions of the intervention components (see online supplementary appendix figure 7). Each survey participant received a $25 gift card.

During the follow-up period we also conducted semistructured telephone interviews with a random sample of study clinicians, stratified by clinic, to explore how the intervention affected their conversations with patients and clinical decision making (see online supplementary appendix figure 8). Each interview participant received a $25 gift card. We conducted 24 interviews, with at least two interviews per clinic, stopping when there were no novel responses to interview questions. Interviews were audio-recorded and transcribed for subjects who consented to recording; for subjects who did not consent to recording, detailed notes were taken. Two members of the study team independently coded all transcripts and notes and identified themes using template analysis.48

Sensitivity analyses

We conducted three sensitivity analyses. The first was to gauge the robustness of the primary outcome results to the exclusion of one clinician from clinic 1 who started to also practise in clinic 4 before the intervention started in clinic 4. The second was to gauge the robustness of the primary outcome results to alternative ICD-9 codes to identify applicable visits that could be more sensitive for identifying use of low-value services3 and capture any shifts in coding between the control and intervention periods. These more sensitive codes included the ICD-9 codes used for the main analyses and additional codes used in previous research to identify each condition in administrative data49–60 (see online supplementary appendix table 2). The third was to examine the robustness of our results by also using cluster-bootstrapped and conventional SEs.
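For the third sensitivity analysis, a minimal sketch of a cluster bootstrap in Python is shown below (the study’s analyses were run in Stata; the model formula and column names follow the hypothetical sketch above, the provider variance component is omitted for brevity, and resampled clinics are relabelled so repeated draws remain distinct clusters):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def intervention_effect(df):
    """Fit a simplified visit-level mixed model and return the intervention coefficient."""
    m = smf.mixedlm(
        "low_value ~ intervention + age + female + charlson + C(month) + C(condition)",
        data=df, groups="clinic",
    )
    return m.fit().params["intervention"]

def cluster_bootstrap_ci(df, n_boot=500, seed=0):
    """Percentile 95% CI from resampling whole clinics with replacement."""
    rng = np.random.default_rng(seed)
    clinics = df["clinic"].unique()
    estimates = []
    for _ in range(n_boot):
        sampled = rng.choice(clinics, size=len(clinics), replace=True)
        pieces = []
        for new_id, clinic in enumerate(sampled):
            piece = df[df["clinic"] == clinic].copy()
            piece["clinic"] = new_id      # relabel so a clinic drawn twice forms two clusters
            pieces.append(piece)
        estimates.append(intervention_effect(pd.concat(pieces, ignore_index=True)))
    return np.percentile(estimates, [2.5, 97.5])

df = pd.read_csv("analytic_visits.csv")   # hypothetical analytic file
print(cluster_bootstrap_ci(df))
```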

Results

Fifty-three clinicians were eligible for the study, and 45 (85%) consented to participate (figure 1). All 45 participating clinicians (100%) chose to precommit to follow the three Choosing Wisely recommendations. Clinician characteristics are shown in table 1.

Table 1

Characteristics of study clinicians, clinics and patients

Orders for potentially low-value services

During the control period, 10.0% of 10 420 visits had an order for a potentially low-value service (table 2); trends by clinic are shown in online supplementary appendix figure 9. During the 7593 intervention period visits, there was no statistically significant decrease in the overall percentage of visits with an order for a potentially low-value service (−1.4%, 95% CI −2.9% to 0.1%; p=0.06). However, for low back pain there was a statistically significant decrease in the percentage of visits with an order for a potentially low-value service (−1.2%, 95% CI −2.0% to −0.5%; p=0.001).

Table 2

Changes in percentages of visits with orders for potentially low-value services

In sensitivity analyses, when we excluded the one clinician from clinic 1 who started to also practise in clinic 4 before the intervention started in clinic 4, we found nearly identical results (data not shown). When we used the more sensitive sets of ICD-9 codes to identify applicable visits, the sample size for the control and intervention periods combined increased from 18 013 to 95 477, and the changes in the percentage of visits for low back pain with an order for a potentially low-value service were similar (see online supplementary appendix table 3). Additionally, there were statistically significant decreases in the percentages of visits with an order for a potentially low-value service overall (−1.8%, 95% CI −2.9% to −0.7%; p=0.001), for headaches (−0.7%, 95% CI −1.3% to −0.2%; p=0.006) and for acute sinusitis (−3.2%, 95% CI −5.1% to −1.3%; p=0.001). We found similar results using cluster-bootstrapped SEs (see online supplementary appendix table 4) and a statistically significant decrease in the overall percentage of visits with an order for a potentially low-value service (95% CI −2.7% to −0.2%; p=0.02) when using conventional SEs.

During the 5883 follow-up period visits, there was no statistically significant decrease in the overall or condition-specific percentage of visits with orders for potentially low-value services relative to the control period (table 2).

Orders for potential alternate services

During the intervention period there was a 1.7% increase in the overall percentage of visits with orders for potential alternate services relative to the control period (95% CI 0.3% to 3.1%; p=0.01) (table 3). Low back pain was the only individual condition for which there was a statistically significant increase in the percentage of visits with a potential alternate order (1.9%, 95% CI 0.5% to 3.3%; p=0.007).

Table 3

Changes in percentages of visits with potential alternate orders

Postintervention surveys and interviews

Twenty-one of the 44 clinicians (47.7%) who completed the survey found the precommitment itself helpful, 14 (31.8%) found the precommitment reminders helpful, 28 (63.6%) found the patient education handouts helpful, and 9 (20.4%) found the online resources to improve communication with patients helpful. Of the 24 clinicians we interviewed, 14 felt the intervention improved their conversations with patients about low-value services. One clinician, for example, said the intervention “provided extra support in my decision [to avoid ordering a low-value service] and something else for [patients] to read and understand.” Another said, “I was able to use those handouts and point to specific parts…saying this is what we believe to be evidence-based and this is the right way…to practice.” Ten clinicians felt the intervention positively changed their overall clinical decision making. For example, one clinician said, “the more difficult patients where I would have been tempted to not follow guidelines…I would more consistently follow guidelines in those tougher cases and spend the extra time.” Another said, “I think I probably [now] lean much further in the direction of not treating too early.” Additional representative quotes are shown in online supplementary appendix table 5.

Discussion

We found that a behavioural economic strategy of asking clinicians to precommit to avoid ordering specific low-value services, followed by supports to promote adherence to this precommitment, was associated with only a small, statistically significant decrease in orders for potentially low-value services in visits for one of three targeted conditions: low back pain. Further, this effect was not sustained in the 3 months after the intervention ended, and the intervention may have led to alternate orders of questionable value.

Our study’s strengths include the translation of multiple insights from behavioural economics into a novel and highly scalable intervention that targeted multiple low-value health services, the randomised design, the conduct of our trial in a community (rather than academic) practice, the measurement of unintended consequences, and the use of mixed methods to better understand the impact of the intervention on clinicians’ decision making. Weaknesses of our study include our inability to determine whether appropriateness of ordering changed as a result of the intervention because our data lacked detailed clinical information. Yet this was consistent with the main goal of our study, which was to first determine whether the intervention could change clinicians’ ordering behaviours. Future studies should supplement ordering data with detailed clinical data to examine how the intervention affects appropriateness. Our study was of a relatively short duration and stepped wedge trials are not well-suited to estimate long-term intervention effects. Thus it will be important in future research to measure longer term effects using other study designs. We tested a multicomponent intervention9 and our study was not designed to disentangle the effects of each of the precommitment intervention components, although our survey and interview results provide insight into clinicians’ perceptions of these components. As with any surveys or interviews, our findings about clinicians’ views of the intervention could have been subject to recall or social desirability biases. Our study did not examine patient experiences or economic impact, and it will be important in future work to examine how this intervention affects these outcomes. Although we conducted our study in a community group practice to enhance the external validity of our findings, our results may not generalise to all clinical settings.

Other studies have tested interventions to improve clinicians’ decisions about low-value care9 and guideline adherence,61 yet we are aware of only two trials that have tested behavioural economic strategies to discourage ordering of low-value services. The first trial tested poster-sized letters from clinicians that were placed in clinic examination rooms for 12 weeks. These posters, which stated clinicians’ commitment to avoid inappropriate antibiotic prescribing for acute respiratory infections (ARIs) and included their photographs and signatures, led to a 19.7% absolute reduction in inappropriate antibiotic prescriptions for ARIs.62 The second trial found that 18 months of ‘accountable justification’ (prompting clinicians to enter free-text justifications for prescribing antibiotics into patients’ EHRs) and ‘peer comparison’ (emails to clinicians that compared their antibiotic prescribing rates with those of peers with the lowest inappropriate prescribing rates) led to 7.0% and 5.2% absolute reductions, respectively, in inappropriate antibiotic prescriptions.63 In contrast to these studies, ours tested a multicomponent intervention that aimed to reduce potentially low-value orders for multiple conditions, including an acute condition (ie, acute sinusitis) and conditions that are often chronic (ie, low back pain and headaches). For these latter two conditions the orders we targeted were for diagnostic imaging, which per order is typically more costly than an antibiotic. Further, our study examined the effects of the intervention on ordering of potentially low-value services and on clinicians’ experiences and unintended consequences.

Our results have important implications for practitioners and policymakers. First, while our intervention was associated with a decrease in the percentage of visits for low back pain with an order for a potentially low-value service, the magnitude of this change was small (just 12 fewer potentially low-value orders per 1000 visits) and not sustained in the near term after the intervention ended. Nonetheless, our intervention required minimal resources and minor adjustments to clinical processes, and the results for low back pain are comparable in magnitude to the effects of the Medicare Pioneer Accountable Care Organization programme on use of low-value care.64 It is possible that sustained changes in ordering could be achieved by continuously integrating the intervention into routine care, for example as an EHR clinical decision support algorithm in higher resource settings or as a paper-based approach in lower resource settings. Second, for headaches and acute sinusitis, we only found a decrease in ordering potentially low-value services in sensitivity analyses using more sensitive ICD-9 codes. For these two conditions the increases in sample size in the sensitivity analyses were particularly large, which could have enhanced our ability to detect a statistically significant effect for these conditions. Alternatively, because the more specific codes for these two conditions were used less commonly in the intervention period than the control period (see online supplementary appendix table 6), shifts in coding or seasonal trends could have contributed to these sensitivity analysis findings. Another possibility is that the more sensitive ICD-9 codes for headaches and acute sinusitis could have less specificity for identifying use of low-value services,3 and thus when used in sensitivity analyses may have detected decreases in appropriate orders. Future studies should resolve these differences between the main and sensitivity analyses by examining how this intervention changes the appropriateness of care, which could be accomplished using measures that combine ordering and clinical data.4 Third, our survey and interview findings suggest our results were driven at least in part by changes in clinician–patient interactions. At its core, the Choosing Wisely campaign encourages clinicians and patients to question services that may not yield benefits and could lead to harms,65 and an effective way to facilitate such conversations could be provision of point-of-care handouts by clinicians who had precommitted to avoiding low-value care. Fourth, the intervention may have led to more referrals to other clinicians who could order the targeted services or other services of questionable value, particularly in visits for low back pain. Because the urge to ‘do something’ can be strong in clinical practice,66 our results suggest it will be crucial to monitor for and seek to ameliorate such unintended consequences in efforts to reduce low-value care.

In conclusion, a behavioural economic strategy of asking clinicians to precommit to specific Choosing Wisely recommendations paired with decision supports, although theoretically promising and highly scalable, only yielded a small and unsustained decrease in potentially low-value orders for one of three targeted conditions. The intervention also likely led to a small increase in alternate orders for services of questionable value. These results highlight the potential of behavioural economic strategies to be integrated into clinical workflows with minimal resources as well as some of the important challenges to be confronted in interventions to reduce ordering of low-value services. As momentum to reduce delivery of low-value care continues to grow worldwide,4 66 more such research is needed to identify low-cost, scalable strategies to achieve this goal while minimising the potential for unintended consequences.

Acknowledgments

We thank A Jay Holmgren, Eric Pfeifer and Dori Cross for assistance with coding the interview data and Pavel Vinarov for assistance with assembling the administrative data.

References

Footnotes

  • Funding This work was funded by the Robert Wood Johnson Foundation (grant number 71475). Support was also provided by the US Department of Veterans Affairs (VA). Dr Kullgren was supported by a VA Health Services Research & Development Career Development Award (grant number 13-267). The funding sources had no role in the design and conduct of the study; collection, management, analysis and interpretation of the data; or preparation, review or approval of the manuscript.

  • Competing interests JTK has received research grants from the US Department of Veterans Affairs Health Services Research & Development (HSR&D) Service, Robert Wood Johnson Foundation, Donaghue Foundation, and Center for Medicare & Medicaid Services (CMS); received consulting fees from SeeChange Health and HealthMine; and received speaking honoraria from AbilTo. EK has received research grants from the Robert Wood Johnson Foundation, Donaghue Foundation and National Science Foundation. JM has received research grants from the Donaghue Foundation, Agency for Healthcare Research and Quality, Commonwealth Fund, CMS, John A Hartford Foundation, California Health Care Foundation, DHHS – Office of the National Coordinator for Health IT, and Blue Cross Blue Shield of Michigan Foundation. JM is on the advisory board of QPID Health.

  • Ethics approval The protocol was approved by the institutional review board of the University of Michigan Medical School (HUM00087820).

  • Provenance and peer review Not commissioned; externally peer reviewed.

  • Data sharing statement Patient-level data, statistical code and the trial protocol are available from the corresponding author at jkullgre@med.umich.edu. Consent for data sharing was not obtained from participants but the presented data are anonymised and risk of identification is low.
