How, why and under what circumstances does a quality improvement collaborative build knowledge and skills in clinicians working with people with dementia? A realist informed process evaluation
Abstract
In increasingly constrained health and aged care services, strategies are needed to improve quality and translate evidence into practice. In dementia care, recent failures in quality and safety have led the WHO to prioritise the translation of known evidence into practice. While quality improvement collaboratives have been widely used in healthcare, there are few examples in dementia care.
We describe a recent quality improvement collaborative to improve dementia care across Australia and assess the implementation outcomes of acceptability and feasibility of this strategy to translate known evidence into practice. A realist-informed process evaluation was used to analyse how, why and under what circumstances a quality improvement collaborative built knowledge and skills in clinicians working in dementia care.
This realist-informed process evaluation developed, tested and refined the programme theory of a quality improvement collaborative. Data were collected pre-intervention and post-intervention using surveys and interviews with participants (n=28). A combined inductive and deductive data analysis process integrated three frameworks to examine the context and mechanisms of knowledge and skill building in participant clinicians.
A refined programme theory showed how and why clinicians built knowledge and skills in quality improvement in dementia care. Six mechanisms were identified: motivation, accountability, identity, collective learning, credibility and reflective practice. In combination, these mechanisms operated to overcome constraints, role boundaries and pessimism about improved practice in dementia care.
A quality improvement collaborative designed for clinicians in different contexts and roles was acceptable and feasible in building knowledge, skills and confidence of clinicians to improve dementia care. Supportive reflective practice and a credible, flexible and collaborative process optimised quality improvement knowledge and skills in clinicians working with people with dementia.
Trial registration number
ACTRN12618000268246.
Background
The challenge of implementing evidence-based guidelines into clinical practice continues to be of concern in healthcare.1 In dementia care, recent Organisation for Economic Co-operation and Development (OECD) reports show that poor care and low levels of training persist in member countries.2 In Australia, serious failures in dementia care3 have prompted inquiries into safety and quality.4 Despite evidence that post-diagnostic interventions improve quality of life, scepticism persists about the ability of people with dementia to benefit, resulting in lower uptake of evidence.5–7 In this context, the WHO Global action plan on the public health response to dementia 2017–20258 identified the translation of known evidence into practice as a priority. In the complex field of dementia care, understanding which strategies work9 to overcome barriers to implementation is key to improving the quality of care for people living with dementia.
One approach widely used to implement evidence-based practices is the quality improvement collaborative (QIC).10 This approach, developed by the Institute for Healthcare Improvement,11 brings health professionals together to learn and share methods to improve care. Key elements include a focus on a specific topic of healthcare, participants from multiple sites, clinical and quality improvement experts who provide advice and guidance to participants, structured activities to identify and try out improvements over time, and monitoring of progress against the aims of the improvement.12 13 Despite their appeal in improving healthcare, high set-up costs and varied success limit confidence in their use.10 14 15 There are few examples of QIC applications in dementia care.16–18
Recent studies of QICs have described their components,12 19 reported on evaluations,20 21 effectiveness10 and cost-effectiveness,15 and identified factors influencing outcomes.22–24 Researchers have identified the need to open the ‘black box’ of QICs to understand how their components contribute to success.13 25 26 A theory-based understanding of the QIC process is advocated to better understand the influence of context on outcomes.27 28 Understanding how and why QICs work under different circumstances is critical to assess their suitability and justify the approach. Complex interventions such as QICs comprise multiple components that interact with each other and with the external and organisational contexts in which they operate.29 Linking a theoretical approach to an evaluation framework helps in designing implementation interventions and evaluating how they work. Realist approaches have been used to understand how QICs work,28 and several studies have reported realist evaluations of process and outcomes.30–32 Few studies have used a realist approach33 or explored the use of QICs to improve the quality of dementia care.34 Realist evaluation35 provides methods to understand how clinicians build knowledge and skills to improve dementia care in diverse settings.36
Methods
Aim
This realist-informed process evaluation aimed to improve our understanding of how a trial QIC worked to implement evidence-based guidelines into practice in dementia care. The approach uses realist evaluation methods to focus on how feasible and acceptable the QIC was at the trial stage.30 37 38 A full realist evaluation would also have considered the effect on guideline adherence;37 39 this component of the evaluation has been reported separately by Laver et al.40 The research questions were:
How, why and under what circumstances do QICs build knowledge and skills in clinicians to improve quality and practice?
Was the QIC approach acceptable and feasible to clinicians?
The process evaluation was embedded within a translational research trial (referred to as ‘Agents of Change’) which examined whether QICs could improve adherence to several recommendations in the Australian Clinical Practice Guidelines and Principles of Care for People with Dementia (referred to hereafter as the guidelines).41 Full methods for the trial have been published in a protocol paper.41 The effect of the QIC on the outcome of guideline adherence was measured using an interrupted time series design and the results were reported recently.40 Clinicians responded to advertisements for the collaborative and self-selected to join one of three subgroups within the collaborative, related to implementation of exercise, carer support or occupational therapy recommendations. A light-touch, low-cost intervention was trialled, comprising online learning modules, teleconference meetings and email communication to reduce the time and costs of participation. Local adaptation of the guideline recommendations was encouraged. Online supplemental file 1 summarises the components of this QIC.
Patient and public involvement
Experts-by-experience of dementia (people with dementia and caregivers) were involved in developing priorities and advising researchers and clinicians throughout the trial of the QIC. One expert-by-experience of dementia was an investigator in the trial and was involved in the conduct and monitoring of the evaluation. She provided advice and comments on the text and, with other experts-by-experience of dementia, advised on the form of acknowledgement of their work. Results of the main trial and from this and other studies have been sent in email newsletters to all participants and co-researchers, with a link to the published papers. Experts-by-experience of dementia and co-researchers have presented results at national and international conferences. An evaluation of their impact will be reported separately.
Study design
This process evaluation followed guidance on process evaluation37 42 and realist evaluation35 in knowledge translation interventions.43 It addressed implementation outcomes of acceptability and feasibility of the trial of the QIC approach in building skills and knowledge of participating clinicians.30 39 Outcomes of fidelity, penetration and uptake of the clinical guidelines for dementia care as described in the protocol paper41 were reported recently.40
The study was completed in four phases:
Phase 1: development of the initial programme logic and programme theory to be confirmed with the research team.
This involved: (1) describing the strategy and logic of the programme, (2) hypothesising a programme theory and (3) proposing underlying mechanisms (M) to achieve the implementation outcomes (O).37 These are expressed as context (C), mechanism (M) and outcome (O) configurations35 to enable understanding of the relationships between these programme aspects.
The initial programme theory was developed through iterative searches of grey literature and academic databases for theory components, as recommended by Booth et al.44 Terms used were collaborative learning, quality improvement, skills and knowledge, guideline implementation and QIC in healthcare.29 44 The multiple components of the QIC method45 were explored by reference to Institute for Healthcare Improvement11 and The Health Foundation reports,46 47 and initial theory components were then identified.48 A limited stakeholder (trial research team) consultation developed ‘If…then’ statements (figure 1A) to be tested with clinicians at the post-intervention stage.
Figure 1 | Initial and refined programme theory of a quality improvement collaborative in the Agents of Change trial.
Phase 2: pre-intervention and post-intervention data collection of surveys and interviews (quant + QUAL)
A concurrent mixed-methods approach49 was used to develop an understanding of the clinicians’ experience in the QIC. The Quality Improvement Knowledge Assessment Tool (QIKAT-R)50 survey identified changes in their level of knowledge of quality improvement over two time points. The Normalization MeAsure Development (NoMAD) survey51 was used to identify clinicians’ understanding of processes to normalise changes in practice on commencement and at completion of the programme. Pre-intervention interview questions were developed to explore context using the Consolidated Framework for Implementation Research (CFIR),52 the processes of implementation using normalisation process theory (NPT)53 and expectations of the QIC using programme theory.35 At the post-intervention stage, questions related to clinicians’ experience, barriers and enablers, and achievements. The initial programme theory was then shared with clinicians to explore their reasoning and responses to the hypotheses.
Phase 3: data analysis, patterns of mechanisms and hypothesis testing
A framework analysis of interview data identified change in knowledge and skills, multilevel contextual influences and clinicians’ experience of the collaborative. Exit interviews were conducted with clinicians who withdrew from the collaborative to provide an understanding of their reasons for withdrawal. Summaries of patterns of mechanisms are presented for three major settings in which clinicians worked. The interviews described acceptability and feasibility of the trial of the QIC and how learning generated change.54 The survey data were integrated with the patterns of mechanisms to test the programme theory.
Phase 4: refinement of the initial programme theory
First, the initial programme theory was shared with clinicians in the post-intervention interviews. The ‘if…then’ propositions were discussed to assess if and how each applied. Clinicians offered their own rationale for each proposition, some refuting, some confirming and most refining the propositions. Second, these responses were compared with the pre-intervention results to identify where they differed and to revise the theory of how and why the QIC worked and in what circumstances.48 Third, survey results were integrated with interview results to identify patterns of mechanisms and differences between three main setting types. Where patterns matched hypotheses the programme theory was confirmed. Where data did not match, the hypothesis was refuted and where additional conditions were identified, the programme theory was revised to improve understanding of how the trial QIC built knowledge and skills for clinicians.43
Data collection
Surveys
Quantitative data were collected in two surveys, the QIKAT-R50 and the NPT measure (NoMAD),51 administered pre-participation and post-participation in the collaborative. The QIKAT-R is designed to assess clinicians’ ability to write an aim, a measure and a change for a quality improvement scenario.50 NoMAD51 assessed clinicians’ degree of agreement with statements based on the four NPT53 constructs of normalising a change to practice.
Clinicians consented to participate in the evaluation and undertook the surveys online in the introductory and final learning modules. Data were extracted for analysis of changes in understanding. Online supplemental file 2 provides an outline of the interview questions and online supplemental files 3 and 4 provide an example of the NoMAD and QIKAT-R surveys used in the online learning modules. On completion of the programme clinicians were asked to comment online on their degree of success in implementing change.
Interviews
Clinicians were invited to participate in interviews and were introduced to the evaluator via an email from the project coordinator (MCa). The first author (LdlP) undertook the evaluation as a PhD student with experience as a clinician in aged care and sought consent via the approved ethics process. Semi-structured private telephone interviews of up to an hour were conducted by LdlP with clinicians on commencement and completion of the programme. The same interview guide was used for each person to describe their motivations, experiences, setting and role. A realist interviewing approach using a supplemented interview guide was added at the post-intervention stage to share the initial programme theory and understand clinicians’ reasoning and responses.55 With consent, interviews were recorded, transcribed, checked for accuracy and sent to clinicians for comment or addition. Field notes made by LdlP during the interviews captured additional information on accuracy, emphasis or requests to stop recording parts of the interview.
Data analysis
Surveys
Responses were extracted from the online surveys, de-identified for each clinician and compared to identify changes in knowledge and skills of quality improvement (QIKAT-R) and engagement in processes of normalising implementation (NoMAD). QIKAT-R50 responses were scored (by LdlP and GR) using the rubric provided. The principal researcher (KL) resolved any discrepancies. The NoMAD51 survey responses were converted to a five-point Likert scale56 (by LdlP and checked by GR). Descriptive statistics were used to present clinicians’ degree of agreement with implementation processes. Small sample sizes, missing data and the lack of controls limited further statistical analysis.
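For illustration only, the sketch below shows one way responses of this kind could be converted to a five-point Likert scale and summarised descriptively at each time point. It is a minimal sketch, not the study’s actual analysis pipeline, and the item names, response labels and data are hypothetical.

```python
# Minimal sketch (assumed, not the study's pipeline): convert NoMAD-style agreement
# responses to a 1-5 Likert score and produce descriptive pre/post summaries.
# Item names and example data below are hypothetical.
import pandas as pd

LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def summarise(responses: pd.DataFrame) -> pd.DataFrame:
    """Convert text responses to 1-5 scores and return descriptive statistics per item."""
    scored = responses.replace(LIKERT).apply(pd.to_numeric, errors="coerce")
    return scored.agg(["count", "mean", "median", "std"]).T

# Hypothetical example data for two survey items at two time points.
pre = pd.DataFrame({
    "sees_need_for_change": ["Agree", "Strongly agree", "Neutral"],
    "confident_in_coworkers": ["Disagree", "Neutral", "Disagree"],
})
post = pd.DataFrame({
    "sees_need_for_change": ["Agree", "Agree", "Strongly agree"],
    "confident_in_coworkers": ["Agree", "Neutral", "Agree"],
})

print("Pre-intervention:\n", summarise(pre))
print("Post-intervention:\n", summarise(post))
```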
Interviews
Interview data were transcribed verbatim, de-identified and entered into NVivo V.12 software57 for analysis using a combined inductive and deductive58 framework analysis approach.59 Three frameworks were used to identify: issues related to the context (CFIR),52 the social processes involved in normalising the change (NPT)53 and the mechanisms at work within the collaboratives and the broader context (realist evaluation).35 These frameworks provided additional insight into context (C), mechanism (M) and trial outcomes (O) to understand how, why and in what circumstances the collaborative may work. A recent model for synthesising multilevel data in implementation research takes a similar approach.60 Table 1 shows the alignment of these frameworks.
Table 1 | Alignment of frameworks for analysis of qualitative data
Coding categories were developed from the frameworks and interviews were coded deductively (LdlP), with 30% checked for consistency (GR). Any differences were resolved by discussion or consultation with the principal researcher (KL). Patterns of context, mechanism and outcome were searched for in the data through a deliberate and inductive process.30 Quotes from interviews were extracted and presented in the results. This adapted framework analysis was used to confirm, refute or revise the initial programme theory.61
Integration of results
Data from interviews and surveys were integrated at both the pre-intervention and post-intervention stages through description and joint display62 to identify where they confirmed, refuted or revised the initial programme theory. A revised programme theory was developed to explain how and why the collaborative built knowledge and skills in quality improvement.
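To illustrate the joint display approach,62 a minimal sketch follows of how quantitative and qualitative findings could be paired against programme theory propositions. The propositions and assessments shown are hypothetical placeholders, not the study’s tables; the example findings paraphrase results reported in this paper.

```python
# Minimal sketch of a joint display pairing quantitative and qualitative findings
# against programme theory propositions. Propositions and assessments are illustrative
# placeholders; the example findings paraphrase results described in the text.
import pandas as pd

joint_display = pd.DataFrame([
    {
        "proposition": "If clinicians join a credible collaborative, then confidence to change practice grows",
        "quantitative_finding": "QIKAT-R scores improved modestly from pre- to post-intervention",
        "qualitative_finding": "Interviewees described gaining quality improvement knowledge, skills and confidence",
        "assessment": "confirmed",
    },
    {
        "proposition": "If managers support the change, then implementation is completed",
        "quantitative_finding": "NoMAD: perceived manager support decreased post-intervention",
        "qualitative_finding": "Withdrawals were linked to organisational change and reduced manager support",
        "assessment": "revised",
    },
])
print(joint_display.to_string(index=False))
```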
Results
Participants
Of the 45 clinicians in the Agents of Change trial, 28 (62%) were involved in the process evaluation.
The QIKAT-R was completed by 26 (58%) clinicians at pre-intervention and 18 (38%) at post-intervention. The NoMAD survey was completed by 13 (29%) clinicians at pre-intervention and 15 (33%) post-intervention. Table 2 presents the characteristics of clinicians, showing the range of professions, settings, locations, type of organisation as well as the subgroup chosen for the collaborative.
Table 2 | Characteristics of participant clinicians by collaborative subgroup in the process evaluation
Pre-intervention and post-intervention survey results
Results for pre-intervention and post-intervention surveys are presented in online supplemental file 5 (NoMAD) and online supplemental file 6 (QIKAT-R). Most clinicians (80%) scored poorly on the QIKAT-R prior to the intervention, demonstrating limited knowledge about quality improvement and validating the need for learning. These scores improved modestly post-intervention. In the pre-intervention NoMAD survey, most clinicians (70%) saw the need for change and how the guidelines differed from their current practice. They were optimistic about the support they would have from managers and the collaborative but were less confident in their coworkers’ ability to implement the changes. Post-intervention, most indicated decreased support from their managers and increased confidence in coworkers.
Pre-intervention interview results
Interviews were conducted with 24 (53%) clinicians. They reported feeling highly motivated to undertake the process and participate in the collaborative subgroups. Over 85% reported having no experience of leading quality improvement processes and were unsure of their own knowledge or how the implementation process would unfold in their setting.
Context
Most clinicians identified external policy and funding constraints on their organisations which would impact on their practice. This was reflected in changes to their roles, restructuring of the organisation and time constraints.
We're going through a major…change with the new CEO…challenge for me is that because staff are unhappy, we are having a high turnover (participant S11).
In public hospital services, multidisciplinary teams and formalised quality improvement structures were identified as being supportive of the proposed changes. In aged care settings, however, most participants identified role boundaries and scepticism as barriers to quality improvement.
No doubt there'll be a bit of resistance from … staff, … ‘why should I do it your way, I've done it this way my… entire life?’ (participant E11).
What we do is treat pain or put things into place to prevent falls, but that’s all you can do (participant E10).
Wait and see approach
Most clinicians were unaware of the existence of the guidelines before commencement and were uncertain of how recommendations could be adapted to their practice. While most understood how the trial QIC would work, they were cautious about what would be required in their setting.
Implementation processes
Clinicians in all settings understood the intent to adapt implementation of the guidelines to suit their setting and expressed confidence that this approach was acceptable and feasible.
I feel fairly confident that we will be able to get things off the ground and make some changes (participant S15).
Those working in hospital settings were more likely to have experience of quality improvement and had begun to identify who they needed to involve in the change process.
…needs to go through my director and the …reference group …so any reporting back on any changes in process or procedure …would have to be verified … and approved (participant C06).
Most clinicians understood that implementing changes would involve communicating with others and engaging them in new practices.
Mechanisms
The mechanisms identified from interviews were similar but described differently by clinicians in each of the three main settings. Table 3 presents initial mechanisms identified across three settings for participants.
Table 3 | Initial mechanisms identified across three key settings, with example quotations
Post-intervention interview results
On completion of the QIC, interviews were conducted with 16 (36%) clinicians. Most reported their acceptance of, and satisfaction with, the process and identified significant skills and understanding gained from it.
A solid methodology and a solid quality improvement plan have been really critical in getting us to a point where it’s working and sustainable (participant S13).
They reported how the process enabled them to review their own practice.
…quite a bit of reading and reflection that was involved in the project, especially when you’re going through that PDSA cycle (participant C05).
However, they reported that overcoming preconceived ideas was demanding.
there was a bit of education with the staff … at that particular site had some preconceived notions about whether people could attend or not (participant E13).
…people go, ‘Oh no, I haven’t really had anyone that’s suitable.’ And I say ‘Well, who have you had?’ (participant E09).
For those who were successful in making change, the support of managers and involvement of others were key to implementing the guidelines. Clinicians in aged care and public hospital settings reflected on the team effort.
…it was really a team effort at the end of the day. (participant S13).
…identifying your local heroes and putting responsibility on other people …, ‘this isn’t just me doing this. This is us doing this’ (participant E11).
Others were able to align the improvements with organisational strategies and structures and gain support from others. In hospital settings with quality improvement structures, this alignment made the process feasible and provided both accountability and recognition.
… it crosses over many of the domains from the organisational point of view and accountability…. It’s been great to have that recognised (participant C06).
External and internal contextual changes led six clinicians (13%) to withdraw from the programme. Two had personal family circumstances that led them to leave their work and their participation in the programme. The remaining withdrawals were directly related to organisational changes.
Funding changes at a national level resulted in significant organisational and role changes and stress for two clinicians in aged care settings.
…the sector is facing quite dramatic reform…our focus upon managing dementia in the community, may not be a priority going forward (participant S06).
That led to changes in the level of support available from their managers.
…the support from management is very limited because their energy is all being focused on the (organisational) change itself… (participant S02).
In public hospitals, time constraints impacted on the level of inter-disciplinary team support, with one clinician withdrawing due to tensions in the team.
the dynamics were more difficult than I had anticipated, and making any change was going to alienate me (participant O08).
Those who withdrew were disappointed to leave, but valued the learning modules, access to peers and research team support.
Mechanisms at work within the collaborative
The initial programme theory was shared with clinicians in the interviews for them to consider and reflect on their experiences. The ‘if…then’ propositions were presented to clinicians to assess if and how each applied to them. The mechanisms identified on commencement were generally supported and some were modified on reflection. The structured process of the collaborative provided confidence, while a sense of accountability to complete the programme drove commitment to the changes. The collaborative provided a sense of community and confidence in the process. The credibility of the evidence base and of the team of experts and researchers engendered trust and confidence to make changes. An additional mechanism was identified: recognition of achievements through reflection. Table 4 summarises the mechanisms and reasoning identified.
Table 4 | Summary of mechanisms identified by clinicians at the conclusion of the programme, with example quotations
Integration of results
Post-intervention results were integrated and compared with the pre-intervention results to identify where they confirmed, refuted or suggested the programme theory needed revision.
While results from the QIKAT-R survey showed modest improvement in knowledge of quality improvement methods, interview data provided examples from clinicians across settings of gains in knowledge and skills in quality improvement.
The results from the NoMAD survey confirmed that clinicians were engaged with the changes and made efforts to involve others in implementing them. All clinicians agreed that audit reports and feedback on implementation plans helped them to modify practice and deliver changes. The interview data confirmed the value of reflective practice in helping clinicians consider gaps and monitor progress in changes.
Table 5 presents a summary of how the qualitative and quantitative results aligned to confirm, refute or lead to revisions of the programme theory.
Table 5 | Integration of main findings and alignment with programme theory
A refined programme theory was developed and is presented in figure 1B. Support through the QIC built confidence (mechanism) for most clinicians to make changes (outcome) despite constraints and scepticism (context). When support was lacking in their setting, those constraints led some to withdraw or only partially complete the implementation. The credibility of the experts (context) encouraged trust in the process and the confidence (mechanism) of clinicians to commit to improving dementia care (outcome). Review processes (context) enabled reflection and recognition of efforts (mechanism) in improving dementia care (outcome).
Discussion
A realist-informed approach provided insights into how, why and under what circumstances a trial QIC built knowledge and skills in clinicians working in dementia care. The QIC attracted clinicians with a passion to improve dementia care in a context of resource constraints and pessimism about the benefits of interventions to improve quality of life. Devi and colleagues identified how multiple types of staff, prescribed roles, differences about priorities and negative perceptions of care homes affected the use of QICs in the UK.63 Similar contextual influences were seen across services providing dementia care in this study. The QIC provided resources and opportunities for clinicians that were not usually available in their setting and met their needs for support, coaching, practice reflection and a flexible structure. They valued the credibility of the programme, the flexible approach which suited their work needs and the process of trying out changes before adopting a new practice. By being part of a dementia-specific collaborative with access to experts and peers for support and advice, they developed confidence to pursue change in practice. Access to experts-by-experience of dementia and clinical experts convinced clinicians of the benefits and empowered them to challenge preconceived ideas and routine practice. When their personal motivation aligned with organisational structures and resources, clinicians successfully built the knowledge and skills to implement significant systems improvements and were recognised for their achievements.
Others were able to change their practice for the selected recommendations of the guidelines and reported improvements for their clients. Many faced contextual barriers in the form of time and resource constraints, manager or team resistance, major organisational restructures and policy changes. While some clinicians withdrew due to contextual barriers, most gained knowledge, skills and the confidence to engage in quality improvement which improved practice in their setting. There was a sense of empowerment for many clinicians in overcoming barriers to change. Six mechanisms in the QIC were identified: motivation, accountability, identity, collective learning, credibility and reflective practice. The relationships between context, mechanisms and outcomes showed how components of the QIC worked to build a sense of identity and the confidence to challenge preconceived ideas of what was beneficial for people with dementia. The flexible, online delivery and guidance of the QIC programme made the process acceptable and feasible for most clinicians.
While QICs have been studied extensively, implementation has differed and outcomes have been inconsistent.10 64 Few studies have used a realist approach33 or explored the use of collaboratives to improve quality in dementia care.34 Applying a theory-based evaluation to understand how and why a QIC built knowledge and skills in clinicians is key to capacity building65 and to identifying strategies for knowledge translation efforts in dementia care.
By bringing together clinicians from different care settings who work with people with dementia of differing severity, the QIC provided opportunities to work on a range of quality improvement activities suited to their settings. This study advances the understanding of how components of QICs contribute to success and why they matter to clinicians in dementia care. It offers an understanding of how support from peers and experts, and reflective practice within collaboratives, worked specifically in dementia care, where clinicians are often isolated, pessimism exists about the potential gains of interventions for people with dementia, and coworkers may resist change. Adherence to the three key guideline recommendations was sustained over the 9 months of the trial, as reported separately.40 The online learning modules have been made publicly available for clinicians to use, and their uptake will be monitored regularly. The findings offer insights to inform the design of future QICs to further spread clinical guidelines for dementia.
Evaluation strengths and limitations
The use of realist-informed process evaluation was a key strength. A theory-led framework analysis offered perspectives of context, implementation process and mechanisms at work within the collaboratives. The mixed-methods design offered the opportunity to gather rich qualitative and quantitative data to examine how QICs work.
A limitation of this evaluation was the use of the QIKAT-R survey to measure knowledge about quality improvement. The survey was presented in a way that led participants to focus on clinical responses rather than a process improvement approach, resulting in low scores. The interview data provided stronger evidence of improved knowledge and skills. The small number of participants in the evaluation limited statistical analysis but still allowed a rich exploration of the mechanisms and contextual factors affecting their learning.
Conclusion
This study addresses a strategy to improve dementia care. A QIC designed to suit geographically dispersed clinicians in different settings and roles was acceptable and feasible in building knowledge and skills to improve dementia care. The motivations of clinicians and the credibility of the collaborative process empowered them to counter pessimism and improve dementia care. This offers insight into how preconceived ideas of what is possible in dementia care, in complex and resource-constrained contexts, can be overcome.
Collaborators: Experts-by-experience of dementia were involved in the conduct of the trial of the QIC to improve adherence to clinical guidelines. One person, Jane Thompson, was an investigator involved in the design and conduct of the trial and a member of the management steering group, and provided advice and comments on the text and on the form of acknowledgement of the experts-by-experience. The other experts-by-experience of dementia were not involved in the process evaluation but contributed as advisors on the QIC. They are: Nadine Hedger, Ian Gladstone, John Quinn, Glenys Petrie, Gary Collins, Mae Collins.
Contributors: MCa coordinated the trial, assisted with recruitment of participants in the evaluation, data collection, and provided advice and comments on the text. GB assisted with survey analysis and provided advice and comments on the text. GR assisted with data collection and analysis and provided comments on the text. BK was an investigator on the trial on which the evaluation reports and provided advice and comments on the text. MC was an investigator on the trial on which the evaluation reports, involved in the design and delivery of the trial, provided advice and comments on the text. JAF was an investigator on the trial on which the evaluation reports, was involved in the design and delivery of the trial, provided advice and comments on the text. SK was an investigator on the trial on which the evaluation reports, involved in the design and delivery of the trial, provided advice and comments on the text. IC was an investigator on the trial on which the evaluation reports, involved in the design and delivery of the trial, provided advice and comments on the text. CW was an investigator on the trial on which the evaluation reports and provided comments on the text. JT was an investigator on the trial on which the evaluation reports, involved in the design and delivery of the trial, provided advice and comments on the text. KL was the chief investigator of the trial on which the evaluation reports, developed the idea and design for the trial, secured funding and supervised the delivery of the trial and evaluation. She supervised LdlP’s research and candidature, provided advice and comments on the text and acts as guarantor with LdlP.
Funding: This work was supported by the National Health and Medical Research Council (NHMRC) Partnership Centre on Dealing with Cognitive and Related Functional Decline in Older People (grant no. GNT9100000) and an NHMRC Boosting Dementia Research Grant (APP1135667). IC is supported by an NHMRC Senior Practitioner Fellowship. KL is supported by an NHMRC Dementia Research Development Fellowship.
Competing interests: MCa has been employed in the last 5 years to assist with data collection for Alzheimer’s disease drug trials funded by Janssen and Merck.
Provenance and peer review: Not commissioned; externally peer reviewed.
Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.
Data availability statement
Data are available on reasonable request. Consent was not sought for individual participant data to be made available. As sample sizes were small, only aggregated or de-identified data may be available.
Ethics statements
Patient consent for publication:
Not required.
Ethics approval:
Ethical approval for the study was granted by the Southern Adelaide Clinical Human Research Ethics Committee (HREC/17/SAC/88).
Acknowledgements
We gratefully acknowledge the involvement of people living with dementia and caregivers as experts-by-experience of dementia in the Agents of Change Quality Improvement Collaborative.
References
Chan WV, Pearson TA, Bennett GC, et al. ACC/AHA special report: clinical practice guideline implementation strategies: a summary of systematic reviews by the NHLBI Implementation Science Work Group: a report of the American College of Cardiology/American Heart Association Task Force on Clinical Practice Guidelines. J Am Coll Cardiol 2017;69:1076–92. doi:10.1016/j.jacc.2016.11.004
OECD. Care needed: improving the lives of people with dementia. Paris: OECD; 2018.
Groves A, Thompson D, McKellar D, et al. The Oakden report. Adelaide, South Australia: SA Health, Department for Health and Ageing; 2017.
Royal Commission into Aged Care Quality and Safety. Interim report. Adelaide; 2019.
Banerjee S, Wittenberg R. Clinical and cost effectiveness of services for early diagnosis and intervention in dementia. Int J Geriatr Psychiatry 2009;24:748–54. doi:10.1002/gps.2191
MacLeod CA, Bu F, Rutherford AC, et al. Cognitive impairment negatively impacts allied health service uptake: investigating the association between health and service use. SSM Popul Health 2021;13. doi:10.1016/j.ssmph.2020.100720
World Health Organisation. Global action plan on the public health response to dementia 2017–2025. Geneva: World Health Organisation; 2017.
Institute for Healthcare Improvement. The Breakthrough Series: IHI’s model for achieving breakthrough improvement. Cambridge, Massachusetts: Institute for Healthcare Improvement; 2003.
Nadeem E, Olin SS, Hill LC, et al. Understanding the components of quality improvement collaboratives: a systematic literature review. Milbank Q 2013;91:354–94. doi:10.1111/milq.12016
Schouten LMT, Hulscher MEJL, van Everdingen JJE, et al. Evidence for the impact of quality improvement collaboratives: systematic review. BMJ 2008;336:1491–4. doi:10.1136/bmj.39570.749884.BE
de la Perrelle L, Radisic G, Cations M, et al. Costs and economic evaluations of quality improvement collaboratives in healthcare: a systematic review. BMC Health Serv Res 2020;20. doi:10.1186/s12913-020-4981-5
Devi R, Meyer J, Banerjee J, et al. Quality improvement collaborative aiming for proactive healthcare of older people in care homes (PEACH): a realist evaluation protocol. BMJ Open 2018;8. doi:10.1136/bmjopen-2018-023287
Zubair M, Chadborn NH, Gladman JRF, et al. Using comprehensive geriatric assessment for quality improvements in healthcare of older people in UK care homes: protocol for realist review within Proactive Healthcare of Older People in Care Homes (PEACH) study. BMJ Open 2017;7. doi:10.1136/bmjopen-2017-017270
Schouten LM, Hulscher ME, van Everdingen JJ, et al. Short- and long-term effects of a quality improvement collaborative on diabetes management. Implement Sci 2010;5. doi:10.1186/1748-5908-5-94
Hulscher MEJL, Schouten LMT, Grol RPTM, et al. Determinants of success of quality improvement collaboratives: what does the literature show? BMJ Qual Saf 2013;22:19–31. doi:10.1136/bmjqs-2011-000651
Brown V, Fuller J, Ford D, et al. The enablers and barriers for the uptake, use and spread of primary care collaboratives in Australia. Herston, QLD: APHCRI Centre of Research Excellence in Primary Care Microsystems, Discipline of General Practice, The University of Queensland; 2014.
Gustafson DH, Quanbeck AR, Robinson JM, et al. Which elements of improvement collaboratives are most effective? A cluster-randomized trial. Addiction 2013;108:1145–57. doi:10.1111/add.12117
Dückers MLA, Spreeuwenberg P, Wagner C, et al. Exploring the black box of quality improvement collaboratives: modelling relations between conditions, applied changes and outcomes. Implement Sci 2009;4. doi:10.1186/1748-5908-4-74
Algurén B, Nordin A, Andersson-Gäre B, et al. In-depth comparison of two quality improvement collaboratives from different healthcare areas based on registry data: possible factors contributing to sustained improvement in outcomes beyond the project time. Implement Sci 2019;14:74. doi:10.1186/s13012-019-0926-y
Dixon-Woods M, Bosk CL, Aveling EL, et al. Explaining Michigan: developing an ex post theory of a quality improvement program. Milbank Q 2011;89:167–205. doi:10.1111/j.1468-0009.2011.00625.x
Zamboni K, Baker U, Tyagi M, et al. How and under what circumstances do quality improvement collaboratives lead to better outcomes? A systematic review. Implement Sci 2020;15:27. doi:10.1186/s13012-020-0978-z
Shearn K, Allmark P, Piercy H, et al. Building realist program theory for large complex and messy interventions. Int J Qual Methods 2017;16. doi:10.1177/1609406917741796
Rycroft-Malone J, Seers K, Eldh AC, et al. A realist process evaluation within the Facilitating Implementation of Research Evidence (FIRE) cluster randomised controlled international trial: an exemplar. Implement Sci 2018;13:138. doi:10.1186/s13012-018-0811-0
Willis CE, Reid S, Elliott C, et al. A realist evaluation of a physical activity participation intervention for children and youth with disabilities: what works, for whom, in what circumstances, and how? BMC Pediatr 2018;18. doi:10.1186/s12887-018-1089-8
Brand SL, Quinn C, Pearson M, et al. Building programme theory to develop more adaptable and scalable complex interventions: realist formative process evaluation prior to full trial. Evaluation 2019;25:149–70. doi:10.1177/1356389018802134
Handley M, Bunn F, Goodman C, et al. Dementia-friendly interventions to improve the care of people living with dementia admitted to hospitals: a realist review. BMJ Open 2017;7. doi:10.1136/bmjopen-2016-015257
Ogrinc G, Batalden P, Moore S, et al. Realist evaluation as a framework for the assessment of teaching about the improvement of care. J Nurs Educ 2009;48:661–7.
Moore GF, Audrey S, Barker M, et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ 2015;350. doi:10.1136/bmj.h1258
Maben J, Taylor C, Dawson J, et al. A realist informed mixed-methods evaluation of Schwartz Center Rounds® in England. Health Services Delivery Research 2018;6.
Bernet AC, Willens DE, Bauer MS, et al. Effectiveness-implementation hybrid designs: implications for quality improvement science. Implement Sci 2013;8:S2. doi:10.1186/1748-5908-8-S1-S2
Laver K, Cations M, Radisic G, et al. Improving adherence to guideline recommendations in dementia care through establishing a quality improvement collaborative of agents of change: an interrupted time series study. Implement Sci Commun 2020;1:80. doi:10.1186/s43058-020-00073-x
Cations M, Crotty M, Fitzgerald JA, et al. Agents of change: establishing quality improvement collaboratives to improve adherence to Australian clinical guidelines for dementia care. Implement Sci 2018;13:123. doi:10.1186/s13012-018-0820-z
Booth A, Wright J, Briscoe S, et al. Scoping and searching to support realist approaches. In: Doing realist research. London: SAGE Publications Ltd; 2018.
Finch TL, Girling M, May CR, et al. Normalization Process Theory on-line users’ manual, toolkit and NoMAD instrument [measurement instrument]. UK: NPT; 2015.
Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4. doi:10.1186/1748-5908-4-50
May C, Finch T. Implementing, embedding, and integrating practices: an outline of normalization process theory. Sociology 2009;43:535–54.
Smith JD, Li DH, Rafferty MR, et al. The implementation research logic model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci 2020;15:84. doi:10.1186/s13012-020-01041-8
Bergeron DA, Gaboury I. Challenges related to the analytical process in realist evaluation and latest developments on the use of NVivo from a realist perspective. Int J Soc Res Methodol 2020;23:355–65. doi:10.1080/13645579.2019.1697167
Guetterman TC, Fetters MD, Creswell JW, et al. Integrating quantitative and qualitative results in health science mixed methods research through joint displays. Ann Fam Med 2015;13:554–61. doi:10.1370/afm.1865
Devi R, Martin G, Banerjee J, et al. Improving the quality of care in care homes using the quality improvement collaborative approach: lessons learnt from six projects conducted in the UK and the Netherlands. Int J Environ Res Public Health 2020;17. doi:10.3390/ijerph17207601
Robert G, Sarre S, Maben J, et al. Exploring the sustainability of quality improvement interventions in healthcare organisations: a multiple methods study of the 10-year impact of the 'Productive Ward: Releasing Time to Care' programme in English acute hospitals. BMJ Qual Saf 2020;29:31–40. doi:10.1136/bmjqs-2019-009457
Knight AW, Dhillon M, Smith C, et al. A quality improvement collaborative to build improvement capacity in regional primary care support organisations. BMJ Open Qual 2019;8. doi:10.1136/bmjoq-2019-000684
Emmel N, Greenhalgh J, Manzano A, et al. Doing realist research. London: Sage; 2018.
Lacouture A, Breton E, Guichard A, et al. The concept of mechanism from a realist approach: a scoping review to facilitate its operationalization in public health program evaluation. Implement Sci 2015;10:153. doi:10.1186/s13012-015-0345-7
Coury J, Schneider JL, Rivelli JS, et al. Applying the Plan-Do-Study-Act (PDSA) approach to a large pragmatic study involving safety net clinics. BMC Health Serv Res 2017;17. doi:10.1186/s12913-017-2364-3