Abstract
Objective The American Board of Pediatrics’ (ABP) maintenance of certification (MOC) programme seeks to continue educating paediatricians throughout their careers by encouraging lifelong learning and continued improvement. The programme includes four parts, each centring on a different aspect of medical practice. Part 4 MOC centres on quality improvement (QI). Surveys by the ABP suggest that paediatricians are dissatisfied with aspects of part 4, but their reasons are unclear. This study sought to explore factors contributing to dissatisfaction with part 4 by focusing on performance improvement modules (PIMs), a popular means of achieving part 4 credit.
Methods The study used cross-sectional purposive sampling drawing from US physicians working in a range of practice settings: private outpatient, hospital, academic and low-income clinics. The sampling frame was divided by practice characteristics and satisfaction level, derived from a five-point Likert item asking about physician satisfaction regarding a recent PIM. In-depth interviews were conducted with 21 physicians, and the interview data were coded, categorised into themes and analysed using a framework analysis approach.
Results Paediatricians expressed nuanced views of PIMs and remain globally dissatisfied with part 4, although reasons for dissatisfaction varied. Concerns with PIMs included: (1) excessive time and effort; (2) limited improvement and (3) lack of clinically relevant topics. While most agreed that QI is important, participants felt persistently dissatisfied with the mechanics of doing PIMs, especially when QI tasks fell outside of their typical work regimen.
Conclusions Paediatricians agreed that part 4, PIMs, and QI efforts in general still lack clinical relevance and need to be more easily incorporated into practice workflow. Clinicians specifically felt that PIMs must be directly integrated with physicians’ practice settings in terms of topic, data quality and metrics, and must address practice differences in time and monetary resources for completing large or complex projects.
- Quality improvement
- Paediatrics
- Qualitative research
Data availability statement
Data are available on reasonable request.
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
What is already known on this topic?
Online performance improvement modules (PIMs) that satisfy American Board of Pediatrics continuing certification requirements draw considerable criticism. Physicians often assert that projects are not relevant to them or their practices, do not improve patient care and are too time consuming.
What this study adds
This study qualitatively documents the context behind paediatricians’ dissatisfaction with PIMs, and identifies factors associated with dissatisfaction, the most salient barriers to completing PIMs and quality improvement generally, and recommendations by providers for improving PIMs. The study reveals context-specific details behind paediatricians’ concerns about relevance, quality improvement and effort.
How this study might affect research, practice or policy
Insights from the paediatricians interviewed in this study can inform the redesign of online modules to improve their relevance and impact on patient care, and potentially improve overall opinion of maintenance of certification.
Introduction
The American Board of Medical Specialties (ABMS) maintenance of certification (MOC) requirements aim to ensure that physicians have the requisite professional knowledge and skills to stay current within their medical specialty. In paediatrics, MOC is divided into four parts: (part 1) professional standing, (part 2) lifelong learning and self-assessment, (part 3) cognitive expertise and (part 4) improving professional practice. Despite the intent of MOC and some positive perceptions among physicians,1 2 many physicians feel that MOC is irrelevant or a poor assessment of their skills,3–5 and in the last several years many have begun to seriously question the value of MOC programmes.2 6 Creating more effective MOC programmes is an essential element of improving quality generally: participation in the American Board of Pediatrics’ (ABP) MOC programme is required for board certification, and approximately 71 000 paediatricians are board certified,7 making the programme an important conduit for training and educating paediatricians. Additionally, well-trained paediatricians are essential to the health of children across society, and programmes that paediatricians find genuinely useful can be a key driver for improving healthcare quality. It is therefore important to better understand which aspects of MOC paediatricians find dissatisfying and, in doing so, to learn how MOC might be improved to address the overall goal of continuous improvement in the delivery of paediatric care.
The parts of MOC can be fulfilled in various ways, and satisfaction differs across these elements. Part 4, the newest section and the most time-intensive to achieve, requires a physician to complete quality improvement (QI) projects and report outcomes, since significant evidence supports positive patient outcomes when physicians engage in QI.8 While physicians can create their own QI project, and many do, a large number of paediatricians—approximately 7000 annually according to internal data—complete online performance improvement modules (PIMs) instead, which are ready-made QI projects developed by the ABP.9 According to ABP data, diplomates are generally satisfied with parts 1–3 of MOC, especially since the removal of high-stakes testing requirements (part 3). However, reactions to part 4 remain mixed, with some diplomates expressing extreme dissatisfaction while continuing to participate in the modules.
Despite frustrations with part 4 and PIMs in particular, research has measured significant and specific benefits of MOC-associated QI to paediatric care, with improvements in areas such as developmental screening,10 weight management,11 12 HPV vaccination rates,13 research recruitment,14 confidential care15 and supervision of resident education.16 Other studies have shown that completion of online modules leads to measurable QI in paediatric care.9 15 Specific to PIMs, Arvanitis et al demonstrated that not only are PIMs frequently used to satisfy the part 4 requirement, but they lead to significant improvements in physician-reported and parent-reported measures of quality.17 However, differences in use and effectiveness of PIMs for improving quality may occur due to practice characteristics such as academic and non-academic settings, where academic settings typically have more health system resources and QI expertise available to facilitate QI projects. Considering the positive benefits from QI and PIMs, it follows that removing barriers to PIM and QI participation and facilitating high-quality QI efforts in clinics should be an important goal for the field of paediatrics moving forward.
The ABP currently offers 15 PIMs on diverse topics, ranging from addressing health literacy to improving influenza immunisation rates, which are included in the cost of MOC enrolment. Because prior surveys were unable to document reasons for non-participation in QI,4 the aim of this study was to conduct qualitative interviews with paediatricians to understand their perspectives on PIMs and on MOC more generally, especially part 4; to identify potential barriers and facilitators to engaging in PIMs and in QI more broadly; and to better understand how to design PIMs that add value for paediatricians as they maintain certification.
Methods
Study sample
The study sample was selected from a cross-sectional sampling frame of US paediatricians created in conjunction with the ABP. Prior to beginning the study, we received approval from the institutional review board of the University of Florida (UF). The sampling frame came from a group of paediatricians provided by the ABP who had either completed a PIM or had made significant progress on a PIM (defined as completing at least one data collection cycle) during the period from 1 January 2016 to 31 December 2017 (n=9272). Using PIM registration data, the ABP prepared the sampling frame using demographic data (age, sex and race/ethnicity), practice type (academic or non-academic), specialty (general or specialist paediatrician), number of years in practice (<10 years or >10 years), and PIM completion status. Finally, a satisfaction category was added, which we defined by recoding a 5-point Likert item, ‘How would you rate your overall satisfaction with this CME activity?’ into ‘satisfied’ (4–5) and ‘not satisfied’ (1–3). Descriptive analyses of the population and sample were performed using SAS software V.9.4. Importantly, there were no statistical associations between demographic or practice characteristics and PIM completion (online supplemental material 1).
To assess differences in barriers and facilitators through qualitative interviews and to include diverse perspectives, we used a purposeful sampling method drawing paediatricians from across the characteristics listed above. Initially, the ABP sampled randomly from each group, approaching those selected via email and asking their permission for UF staff to contact them. A UF research coordinator (JJH) then contacted those who agreed, and interview participants were offered a US$100 gift card as an honorarium. As the study progressed, the interviewer directed the ABP to sample more heavily from particular subgroups in order to thoroughly explore emerging themes. Sampling and recruitment continued concurrently with data collection until the data were thoroughly described by our set of codes (ie, saturation). Recruitment spanned October 2018 to March 2019. We achieved a response rate of 20.7% from the 103 paediatricians originally contacted (57 did not respond, 25 refused and 27 agreed, of whom 21 completed interviews).
Qualitative methods
All authors participated in developing the semistructured interview guide (online supplemental material 2), and Dr. Hendricks, a male with a PhD and background in qualitative research working as a research coordinator at the time, conducted all telephone interviews. Choosing an interviewer without prior clinical experience reduced preconceived notions about MOC or PIMs and allowed for deeper interviews by probing terms or meanings that may be taken for granted by a researcher with clinical experience. The participants did not know the researcher prior to the study and the researcher explained he was completing this study at the request of the ABP but employed by the UF. Questions addressed satisfaction and dissatisfaction with MOC and PIMs, the process of choosing a PIM, and clinic-based supports, facilitators and barriers to completing PIMs. The interviewer also confirmed participants’ demographic information and practice characteristics. Each interview lasted 30 min or less, was audiorecorded, deidentified and transcribed.
The resulting data were analysed from a pragmatic interpretivist theoretical framework18 using thematic analysis and followed an iterative, team-based approach using constant comparison methodology.19 For analysis, Dr. Hendricks worked with Dr. Theis, a male PhD working as an assistant professor in qualitative health services research, to independently review and code transcripts and develop a codebook using a spreadsheet. After independently coding the first five interviews using line-by-line initial coding, the two authors worked together to develop a codebook, which was then transferred into Dedoose, a qualitative coding software package, for further coding.20 The remaining interviews were divided between the two researchers, each coding half while meeting regularly to review code applications, adjust the codebook and resolve discrepancies. All coding was completed using Dedoose.20 A clinical investigator (CR) reviewed all coding for consensus and for any missing meaning related to clinical terms. The coders did not identify any new codes after the ninth interview, and the codebook continued to be refined until the fourteenth. After completing all coding, the researchers organised the codes into themes and used a framework analysis of findings cross-tabulated by theme and participant characteristics to identify the most salient themes and permit an in-depth exploration of variation in provider experiences and perceptions.21
Patient and public involvement
Patients or the public were not involved in the design, conduct or reporting of this research.
Results
Participant characteristics
Table 1 shows demographic variables for all paediatricians who either completed a PIM or had one in progress. Statistical analyses of completion status did not reveal differences by gender, last training year, medical school graduate type, MOC status or subspecialty status, and thus we performed no further quantitative analyses. While many of the variables had too much missing data to be analysed reliably, the lack of any association suggested a more complex picture than we anticipated.
Among the 21 paediatricians who participated in qualitative interviews, the majority were female (71%) and non-Hispanic white (57%). Most study quotas were represented, capturing diverse perspectives from those in academic (52%) and non-academic settings (48%), generalists (57%) and subspecialists (38%), and those in practice for ≤10 years (52%) or 11+ years (48%). The majority of participants had completed a PIM (86%) and had been classified as ‘not satisfied’ based on their PIM survey responses (62%).
Thematic analysis
A thematic analysis of the interview transcripts revealed emerging themes in three broad categories: (1) factors related to MOC and PIM satisfaction and dissatisfaction, (2) barriers to completing PIMs and conducting meaningful QI work as part of MOC and (3) recommendations for making PIMs more accessible and useful.
Physician satisfaction/dissatisfaction with MOC part 4 and PIMs
Some participants expressed satisfaction with the QI components of part 4. Key themes related to satisfaction included convenience of QI and PIMs and positive impact of QI projects. Four participants attributed satisfaction to the convenience of QI and PIMs. One participant considered part 4 to be ‘the best one of all’ because it was quick, easy, and useful for residents and nurses, and ‘resource-light’ (ID 1, generalist, academic, FQHC). Another remarked on the convenience of having a consolidated electronic database for documenting PIMs, adding that completing PIMs ‘is worth the time’ (ID 15, generalist, non-academic, private practice).
Satisfaction was also reported in two cases where PIMs effected positive change in clinical practice or outcomes. One participant reported that the motivational interviewing module was effective, stating, ‘It really empowered the teenagers’ (ID 2, generalist, non-academic, group practice). Another noted that QI encourages documentation and evaluation of practice and improves awareness among providers and staff, stating that structured processes such as PIMs help clinicians remember to follow clinical practice guidelines. One academic specialist paediatrician, whose PIM follow-up survey response was coded as ‘dissatisfied’, nonetheless reported that selecting PIM topics relevant to her practice increased staff awareness and was good for patients (ID 6, specialist, academic, children’s hospital).
In contrast, themes related to dissatisfaction with part 4 were more common, including reference to the ineffective or negative impact of QI efforts, the burden of completing MOC part 4, and monetary costs of existing QI activities. Table 2 provides the most salient dissatisfaction codes and definitions used to identify these themes. In many cases, participant reports of dissatisfaction did not distinguish between PIMs specifically and the MOC process in general.
Ten participants reported that QI efforts and PIMs had no positive impact on clinical practice or patient outcomes. Some stated that PIMs provided no learning opportunities for clinicians and staff. Others attributed the ineffectiveness of QI to PIM topics that were not relevant to their practice or did not address a real problem. As one stated: ‘I can see the utility of having to actually review your information… but in terms of the PIMs, maybe I just didn't pick ones that would challenge the situation.’ (ID 3, generalist, non-academic, FQHC)
Eight participants remarked on the burden of conducting QI, describing PIM activities as ‘busy work’. One described the effort of completing a PIM as difficult to balance with clinical activities and workflows, stating: ‘We got to keep moving and see our patients, and we can't really stop for an extra survey, or an extra this, or an extra that.’ (ID 3, generalist, non-academic, FQHC).
Three participants expressed reservations about the cost of accessing and completing PIMs. Notably, one participant described both cost and burden as factors of dissatisfaction:
…I agree with the general principle, but I think the execution is poor….it’s very expensive… it’s excessively time consuming… it’s poorly relevant. I think it doesn't allow for physicians to claim for the work that they're already doing, and instead it creates busy work, which takes us away from our lives outside of medicine. (ID 21, specialist, academic, children’s hospital)
Specific barriers to completing PIMs and conducting meaningful QI
Participants disclosed important barriers to completing PIMs, with perspectives varying between academic and non-academic providers (online supplemental material 3). These barriers echoed the dissatisfaction themes described above, including: (1) the excessive effort to conduct PIMs in the context of a busy clinic; (2) the lack of benefits of PIMs for patients, practices and professional development; and (3) the lack of relevance of PIM topics to clinical practice, providers or organisations.
Half of the participants cited excessive effort needed to conduct PIMs as a barrier, including documentation requirements, data entry and uploading data. The time required to conduct PIMs conflicted with patient care priorities and providers’ work-life balance. As one general paediatrician stated:
Most of the projects are pretty time-consuming… It’s not something that I really feel passionate about… I want to have time at home with my kids. (ID 17, generalist, academic, children’s hospital)
Eight participants noted that PIMs did not result in tangible improvements in their clinical practices. Some reported not taking PIMs seriously, which was evident in reported behaviours such as rushing through the activities or ignoring accompanying online discussions. Three participants reported witnessing others fabricate PIM data, or fabricating data themselves, reflecting their beliefs that the activity offered no benefit. As one participant noted about the breastfeeding PIM:
I did not actually gain anything from it. I just, to be honest, did it for the credit… I didn’t collect data from my own office. I just kind of made up the data. (ID 16, generalist, non-academic, group practice)
Some participants described barriers that are associated with QI practices generally, such as having baseline performance on quality metrics that are ‘too high’ to improve on, or lacking a formal structure for QI support or coordination. One academic general paediatrician, who noted her clinic had no support for QI activities, remarked that the hand-washing PIM had little impact on her practice, stating:
It wasn't something I needed to improve on. I mean I had a 99% accuracy rate of washing my hands… I mean, the whole thing was just kind of a waste. (ID 17, generalist, academic, children’s hospital)
One general paediatrician commented specifically on the PIM handwashing module, describing it as ‘fake’ because it generated no benefit:
It just felt like the yield you got out of it is low… You have to pay the ABP to [do] this fake study on hand washing. Their ability at the ABP to certify that we actually did an improvement project, and that there was… improvement to practice. It’s like marginal. Maybe there’s a few people, but most people feel like, ‘oh it’s a hoop!’ Or they hope their hospital or clinic can throw together one on an existing project. (ID 4, generalist, academic, children’s hospital)
Some academic paediatricians described the redundancy of PIM topics with QI they were already doing. Conversely, some providers in non-academic settings selected PIM topics because they were redundant with their clinical practice and would therefore be easy to complete. One non-academic general paediatrician described PIMs as tedious and a ‘complete waste of time,’ commenting:
Hand washing is a quick one. The breastfeeding one is a quick one… I just had people fill out a little paper and then entered it. But nothing changed really in our practice because we do that anyway. (ID 9, generalist, non-academic, group practice)
Recommendations to make MOC QI and PIMs more accessible and useful
Participants’ perspectives yielded recommendations for improving PIMs, divided into three themes. Participants encouraged the ABP to (1) simplify the process of completing PIMs; (2) improve the relevance of PIMs and (3) improve the quality of PIMs to ensure real change. To streamline the process, participants emphasised the importance of simplifying the steps to obtain credit for their own QI, such as reducing the required paperwork, making the submission more structured and addressing the barrier of cost. Other participants suggested technical solutions, such as consolidating PIMs into a single system, improving how data are submitted to the ABP and adapting PIMs to clinical electronic health record (EHR) systems. One hospital-based subspecialist highlighted strategies such as reducing ‘busy work’ and eliminating the ‘high stakes’ that connect PIMs with certification, suggesting that providers should be allowed to ‘self-claim in a more streamlined fashion’.
Recommendations to improve the relevance of PIMs for clinical practice included ensuring that PIMs acknowledge the assets and limitations of specific clinics, offering more choices for PIM topics and making topics adaptable to providers in different career stages, specialties and practice settings. One general paediatrician from an FQHC noted that PIMs could be made more relevant by engaging providers in these settings:
The people generating these PIMs should actually be part of an FQHC… Engage the providers that are working in FQHCs to develop projects that would, obviously, work within the workflow and the patient population, and reward them for it in some way, shape, or form. (ID 3, generalist, non-academic, FQHC)
Some participants offered solutions to improve the quality of PIMs, such as improving the reliability of comparative data for QI projects, aligning PIM topics with current clinical practice guidelines, and presenting QI conducted in academic settings as the ‘gold standard’ for how PIMs should be designed. As one academic subspecialist noted, PIMs should be designed to help paediatricians ‘think through management of the patient’ rather than simply ‘check boxes’.
Discussion
Paediatricians who maintain continuous certification with the ABP voiced criticisms similar to those expressed in the past, articulating the tension between personal autonomy and participation in activities that demonstrate maintenance of professional standards.1 6 Focusing on PIMs, this analysis identified several key themes for moving forward with the aim of improving the quality of care children receive throughout paediatricians’ careers. First, the demographics of paediatricians did not predict PIM completion or satisfaction with the part 4 QI process, reducing the need for the ABP to implement group-specific outreach in the future. Second, these findings show that maintaining PIMs as an option for completing the MOC part 4 requirement is important, as many paediatricians do not practise in higher-resource settings where pursuing their own QI topics is feasible; this will require improving the ease, delivery and design of PIMs. Many paediatricians perceived PIMs to be time consuming and burdensome, reinforcing their view that the modules produced no real benefit to quality or to their practice.
Finally, this study demonstrates ongoing dissatisfaction with QI globally, even after a decade of QI embedded within MOC and despite literature supporting positive changes for both physicians and families.10 11 13 16 22–24 While the ABP needs to improve the rigour of PIMs as tools for online continuing professional development, this persistent dissatisfaction reflects both intrinsic individual beliefs and the extrinsic values of healthcare systems about the worth of QI work.25 This needs to be addressed for any QI effort to succeed, particularly when individuals do not perceive its value and choose instead to fabricate results. Prior research suggests that QI and MOC can be seen as valuable and relevant when organised and implemented around organisational priorities.2 The recommendations that paediatricians offered for improving part 4 and PIMs are valuable and demonstrate the need for modules that are more engaging than those currently available, as well as novel approaches that unify the aim of improving quality with the delivery and measurement of ongoing efforts to improve it. Given current demands for high-quality, online educational opportunities, this analysis provides timely and relevant direction for how the ABP could proceed.
A review of the literature on MOC did not discern any significant personal or practice characteristics that may affect experiences with MOC part 4. In our study, physician and practice characteristics did not yield differing perspectives except in the case of academic versus non-academic settings. In contrast to smaller, resource-poor practices operating in non-academic settings, the resources available to paediatricians in academic centres can help to facilitate QI. However, participating paediatricians operating in higher-resource hospital settings (both academic and non-academic) encountered different types of challenges, including institutional barriers to fulfilling PIMs, reflecting a need for local institutional change to prioritise the system changes that make improvement work easier within practice settings (eg, access to performance data). Changing the culture of QI for individuals and for institutions remains central for any PIM or other educational opportunity to succeed.
Concurrent with this study, the ABMS Continuing Board Certification: Vision for the Future Commission released its report. The report articulated the difficulty of widespread adoption of part 4 MOC but continued to recommend data-driven advances in clinical practice as a core component of continuing certification programmes.26 It is important that ABMS boards embrace the challenge to improve their current MOC part 4 programmes. Doing so can maximise meaningful physician engagement and provide the best opportunity to drive improvements in care. Therefore, improving physician satisfaction with the certification-dependent QI process by providing relevant QI opportunities, such as PIMs, remains a central task. PIMs must also embrace new formats for online education to engage learners of all technical aptitudes and interest levels, such as those supported by Quality Matters.27 In addition, PIMs will need to address high-yield, relevant topics and engage physicians by ensuring that those with varying levels of QI aptitude can participate in a meaningful way. Finally, newer formats provide the opportunity for ethical attestations, affirming professionalism in theory and in practice. This analysis serves as a reminder that complex learning, such as that required to promote, support and effect change using QI methods, provokes strong responses from paediatricians in ways that do not fall cleanly within classifications by setting, specialty or years in practice.
This analysis has several limitations. First, despite a careful sampling strategy based on assumptions of professional differences within paediatrics, the qualitative interviews may still not have captured a representative set of opinions. Missing demographic and practice information in the administrative data, for example, may have resulted in the under-representation or over-representation of certain opinions. Second, our findings may reflect a bias towards both very satisfied and very dissatisfied paediatricians, since those with stronger views may have chosen to participate at higher rates. Further, we only interviewed diplomates who had completed or made substantial progress on PIMs, and we do not have the opinions of paediatricians who avoided PIMs altogether. Finally, since this study was performed, there was an initial decrease in PIM usage according to 2019 ABP data, perhaps representing an increase in QI knowledge and the use of other formats for achieving MOC part 4 credit. However, 2020–2021 may yield different usage data, reinforcing these online modules as opportunities for online content to facilitate change.
Conclusion
In conclusion, this analysis of PIMs articulates areas of concern for paediatricians and highlights the need to continuously improve care for children through the delivery and measurement of QI efforts, including the delivery and format of PIMs. At their inception, PIMs were designed to be a temporary introduction to QI, intended to build knowledge so that paediatricians could go on to design their own QI projects. Yet, given the unprecedented issues the coronavirus pandemic revealed, online learning remains a key opportunity for long-term paediatrician engagement. Paediatricians in resource-poor practices may benefit from QI navigation or coordination for projects that are immediately relevant to their practices, giving them the opportunity to perform QI projects that align their positive theoretical views of QI with their actual experience of QI in practice. With support from the recent overarching ABMS report,26 all boards, including the ABP, will need to fortify their efforts, both online via products such as these PIMs and via practice implementation of QI, to solidify QI globally as a key process for improvement, while balancing the pointed recommendations of participating paediatricians on the specifics of measuring the part 4 QI efforts available for MOC.
Data availability statement
Data are available on reasonable request.
Ethics statements
Patient consent for publication
Ethics approval
This study involves human participants and was approved by the University of Florida IRB-01 (IRB201800751). Participants gave informed consent to participate in the study before taking part.
Acknowledgments
All phases of this research were supported by the ABP Foundation; PI: Lindsay Thompson. The sponsor advised on research design, facilitated recruitment, reviewed results and manuscript, and engaged in discussions concerning journal submission.
References
Footnotes
JJH and RT are joint first authors.
Contributors JJH recruited and interviewed participants; developed interview tools and sampling scheme; worked with a coauthor to develop the analysis plan, codes and codebook; wrote and revised sections of the manuscript; approved the final manuscript as submitted; and accepts full responsibility for the work as guarantor. RT conceptualised the study; developed interview tools and sampling scheme; worked with a coauthor to develop the analysis plan, codes and codebook; wrote and revised sections of the manuscript; and approved the final manuscript as submitted. KJM, LKL, EB, CR, AB, ALT and SLF aided in the conceptualisation of the study, made substantial contributions to the analysis and interpretation of the data, critically reviewed the manuscript drawing on the existing literature, and approved the final manuscript as submitted. LAT conceptualised the study, developed interview tools, helped draft the initial manuscript, and approved the final manuscript as submitted. All authors agreed to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.
Funding American Board of Pediatrics (Thompson, PI), 5/1/2018–11/1/2019, A Mixed Method Evaluation of the ABP PIMs; no award/grant number.
Disclaimer The content is solely the responsibility of the authors and does not necessarily represent the views of the ABP or the ABP Foundation.
Competing interests JJH, RT, CR, AB and SLF have no potential conflicts of interest to disclose. KJM, LKL and ALT are employees of the American Board of Pediatrics (ABP); LAT volunteers as Chair of the Continuing Certification Committee of the ABP. DB volunteers as a member of the ABP Telehealth EPA drafting committee and serves as a strategic consultant to the Association of American Medical Colleges.
Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.
Provenance and peer review Not commissioned; externally peer reviewed.
Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.