Introduction
‘Systems thinking’ is often recommended in healthcare to support quality and safety activities, but a shared understanding of this concept and purposeful guidance on its application are limited. Healthcare systems have been described as complex, where human adaptation to localised circumstances is often necessary to achieve success. Principles for managing and improving system safety developed by the European Organisation for the Safety of Air Navigation (EUROCONTROL; a European intergovernmental air navigation organisation) incorporate a ‘Safety-II systems approach’ to promote understanding of how safety may be achieved in complex work systems. We aimed to adapt and contextualise the core principles of this systems approach and demonstrate their application in a healthcare setting.
Methods
The original EUROCONTROL principles were adapted using consensus-building methods with front-line staff and national safety leaders.
Results
Six interrelated principles for healthcare were agreed. The foundation concept acknowledges that ‘most healthcare problems and solutions belong to the system’. Principle 1 outlines the need to seek multiple perspectives to understand system safety. Principle 2 prompts us to consider the influence of prevailing work conditions—demand, capacity, resources and constraints. Principle 3 stresses the importance of analysing interactions and flow of work within the system. Principle 4 encourages us to attempt to understand why professional decisions made sense at the time, and Principle 5 prompts us to explore everyday work, including the adjustments made to achieve success in changing system conditions.
A case study demonstrates application of the principles in the analysis of a system and in the design of a subsequent improvement intervention.
Conclusions
Application of the adapted principles underpins, and is characteristic of, a holistic systems approach and may aid care team and organisational system understanding and improvement.
- quality improvement
- human factors
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
Adopting a ‘systems thinking’ approach to improvement in healthcare has been recommended as it may improve the ability to understand current work processes, predict system behaviour and design modifications to improve related functioning.1–3 ‘Systems thinking’ involves exploring the characteristics of components within a system (eg, work tasks and technology) and how they interconnect to improve understanding of how outcomes emerge from these interactions. It has been proposed that this approach is necessary when investigating incidents where harm has, or could have, occurred and when designing improvement interventions. While acknowledged as necessary, ‘systems thinking’ is often misunderstood and there does not appear to be a shared understanding and application of related principles and approaches.4–6 There is a need, therefore, for an accessible exposition of systems thinking.
Systems in healthcare are described as complex. In such systems it can be difficult to fully understand how safety is created and maintained.7 Complex systems consist of many dynamic interactions between people, tasks, technology, environments (physical, social and cultural), organisational structures and external factors.8–10 Care system components can be closely ‘coupled’ to other system elements and so change in one area can have unpredicted effects elsewhere with non-linear, cause–effect relations.11 The nature of interactions results in unpredictable changes in system conditions (such as patient demand, staff capacity, available resources and organisational constraints) and goal conflicts (such as the frequent pressure to be efficient and thorough).12 13 To achieve success, people frequently adapt to these system conditions and goal conflicts. But rather than being planned in advance, these adaptations are often approximate responses to the situations faced at the time.14 Therefore, to understand safety (and other emergent outcomes such as workforce well-being) we need to look beyond the individual components of care systems to consider how outcomes (wanted and unwanted) emerge from interactions in, and adaptations to, everyday working conditions.14
Despite the complexity of healthcare systems, we often appear to treat problems and issues in simple, linear terms.15–17 In simple systems (eg, setting your alarm clock to wake you up) and many complicated systems (eg, a car assembly production line), ‘cause and effect’ are often linked in a predictable or linear manner. This contrasts sharply with the complexity, dynamism and uncertainty associated with much of healthcare practice.1 7 18 For example, in a study evaluating the impact of a comprehensive pharmacist review of patients’ medication after hospital discharge, the linear perspective suggested that this specific intervention would improve the safety and quality of medication regimens and so reduce healthcare utilisation.19 Unexpectedly, the opposite result was observed. The authors suggested that this emergent outcome may have arisen because the increased number of interactions with different healthcare professionals increased the complexity of care, resulting in greater anxiety, confusion and dependence on healthcare workers.
Analyses of safety issues in healthcare routinely examine how safety is destroyed or degraded but have surprisingly little to say about how it is created and maintained. In the UK, like many parts of the world, root cause analysis is the recommended method for analysing events with an adverse outcome.20 At its best, this should take a ‘systems approach’ to identify latent system conditions that interacted and contributed to the event and recommend evidence-based change to reduce the risk of recurrence.20 However, we find that the results of such analyses are commonly based on linear ‘cause and effect’ assumptions and thinking.15 16 21 22 Despite allusions to ‘root causes’, investigation approaches have a tendency to focus on single system elements such as people and/or items of equipment, rather than attempting to understand the interacting relationships and dependencies between people and other elements of the sociotechnical system from which safety performance and other outcomes in complex systems emerge.21 By focusing on components in isolation, proposed improvement interventions risk unintended consequences in other parts of the systems and enhanced performance of the targeted component rather than the overall system. The validity of focusing on relatively infrequent, unwanted events has been questioned as it does not always reveal how wanted outcomes usually occur and may limit our learning on how to improve care.22
Despite much related activity internationally, the impact of current safety improvement efforts in healthcare is limited.23–25 Similar to other safety-critical industrial sectors, such as nuclear power or air traffic control, there is a growing realisation in healthcare that exploring how safety is created in complex systems may add value to existing learning and improvement efforts. The European Organisation for the Safety of Air Navigation (EUROCONTROL), a pan-European intergovernmental air navigation organisation, published a white paper, Systems Thinking for Safety: Ten Principles.26 This sets out a way of thinking about safety in organisations that aligns with systems thinking and applies ‘Safety-II’ principles, for which there is also growing interest in healthcare.27 This latter approach attempts to explain and potentially resolve some of the ‘intractable problems’ associated with complex systems such as those found in healthcare, which traditional safety management thinking and responses (termed Safety-I) have struggled to adequately understand and improve on.28 The Safety-II approach aims to increase the number of events with a positive outcome by exploring and understanding how everyday work is done under different conditions and contexts. This can lead to a more informed appreciation of system functioning and complexity that may facilitate a deeper understanding of safety within systems.29 30
In this paper, we describe principles for systems thinking in healthcare that have been adapted and contextualised from the themes within the EUROCONTROL ‘Systems Thinking for Safety’ white paper. Our goal was to provide an accessible framework to explore how work is done under different conditions to facilitate a deeper understanding of safety within systems. A case report applying these principles to healthcare systems is described to illustrate systems thinking in everyday clinical practice and how this may inform quality improvement (QI) work.
Adaptation of EUROCONTROL Systems Thinking Principles
A participatory codesign approach31 was employed with informed stakeholders.32 33 First, in March 2016, a 1-day systems thinking workshop was held for participants holding a variety of roles in front-line primary care (general practitioners (GPs), practice nurses, practice managers and community pharmacists) and for National Health Service (NHS) Scotland patient safety leaders (table 1). The relevance and applicability of the EUROCONTROL white paper principles were explored through presentations and discussion led by two experts in the field (including the lead author of the original document, SS). This was followed by a facilitated small group simulation exercise applying the 10 principles to a range of clinical and administrative healthcare case studies (online supplementary appendix 1; figure 1).
Second, two rounds of consensus building using the Questback online survey tool were undertaken with workshop participants in April and July 2016.34
Finally, in May 2017, two 90 min workshops were held to test and refine the adapted principles with primary and secondary care medical appraisers (experienced medical practitioners with responsibility for the critical review of improvement and safety work performed by front-line peers).
At each stage, feedback was collected and analysed to identify themes related to applicability including wording, merging and missing principles. These themes directed the modification of the original principles and descriptors, which were then used at the next stage of development.
Throughout the process, external guidance and ‘sense-checking’ were provided by a EUROCONTROL human factors expert and lead author of the original systems thinking for safety white paper. While we believe the outputs from this work are generically applicable to all healthcare contexts, we have focused on the primary care setting for pragmatic purposes. The agreed principles are illustrated graphically in the Systems Thinking for Everyday Work (STEW) conceptual model (figure 1), and detailed descriptions are provided in online supplementary appendix 2.
Patient and public involvement
Patients and the public were not involved in the design of the study or the adaptation of the principles. The presented case study included a patient in the application of the principles to analyse the system. A service user read and commented on the manuscript and their feedback was incorporated into the final paper.
Systems Thinking for Everyday Work
The STEW framework consists of six interrelated principles (figure 1, tables 2 and 3, online supplementary appendix 2). A fundamental, overarching point is that the principles should not be viewed as isolated ideas, but as interrelated and interdependent concepts that can aid our understanding of complex work processes and better inform safety and improvement work by healthcare teams and organisations.
The foundation concept acknowledges that ‘most healthcare problems and solutions belong to the system’. This emphasises that the aim of applying a systems approach is to improve overall system functioning and not the functioning of one individual component within a system. For example, improving clinical assessments will not improve overall system performance unless patients can access assessments appropriately.
All systems interact with other systems, but out of necessity those analysing a system need to agree boundaries for the analysis. The boundary may be a GP practice, a single hospital ward, an emergency department, a pharmacy or a nursing home. Even so, it is important to remember that external factors will influence the system under study and that changes may have effects in parts of the system outside the boundary.
Seek multiple perspectives
Appreciate that people, at all organisational levels and regardless of responsibilities and hierarchical status, are the local experts in the work they do. Exploring the different perspectives held by these people, especially in relation to the other principles, is crucial when analysing incidents and designing and implementing change.
Work conditions
Obtaining multiple perspectives allows an exploration of variability in demand and capacity, availability of resources (such as information or physical resources) and constraints (such as guidance that directs work to be performed in a particular way). These considerations can help identify leading indicators of impending trouble by identifying where demand may exceed capacity or where resources may not be available. Multiple perspectives can also help explore how work conditions affect staff well-being (eg, health, safety, motivation, job satisfaction, comfort, joy at work) and performance (eg, care quality, safety, productivity, effectiveness, efficiency).
Interactions and flow
System outputs are dependent on the constantly changing interactions between people, tasks, equipment and the wider environment. Multiple perspectives on system functioning help explore interactions to better understand the effects of actions and proposed changes on other parts of the system. Examining flow of work can help identify how these interactions and the conditions of work contribute to bottlenecks and blockages.
Understand why decisions made sense at the time
When looking back on individual, team or organisational decision-making, this principle reminds us that people do what makes sense to them given the system conditions experienced at the time (demand, capacity, resources and constraints) and the prevailing interactions and flow of work. It is easy (and common) to look back with hindsight, blame or judge individual components (usually humans) and recommend changes such as refresher training or punitive action. Unless we consider why such decisions were made, change is unlikely to be effective: the same conditions may occur again, and the same decision may need to be made to maintain successful system functioning. By exploring why decisions were made, we move beyond blaming ‘human error’, which can help promote a ‘Just Culture’, in which staff are not punished for actions that are in keeping with their experience and training and that were taken to cope with the work conditions faced at the time.35
Performance variability
As work conditions and interactions change rapidly, and often in an unpredicted manner, people adapt what they do to achieve successful outcomes. They make trade-offs, such as efficiency-thoroughness trade-offs, and use workarounds to cope with the conditions they face. In retrospect these could be seen as ‘errors’, but they are often adaptations used to cope with unplanned or unexpected system conditions. They result in a difference between work-as-done and work-as-imagined and define the everyday work from which outcomes, both good and bad, emerge.
The included case report describes the practical application of these principles to understand work within a system and the subsequent design of organisational change (table 3). The presented details are a small part of a larger project in which the authors (DM, PB and SL) were involved. The appointment of a new health board-employed pharmacist to a general practice had not had the anticipated impact, and there had been unexpected effects: the GPs had hoped for a greater reduction in workload, the health board had hoped for increased formulary compliance and workload in secondary care had increased.
Traditional ways of exploring this problem might include working backwards from the problem to identify an area for improvement. In this case, further training of the pharmacist may have been suggested, and targets may have been introduced in relation to workload or formulary compliance. However, without understanding why the pharmacist worked this way, any retraining or change would likely be ineffective. The STEW principles provided a framework to analyse the problem from a systems perspective, understand what influenced the pharmacist’s decisions and explore the effects of these decisions elsewhere in the system. Obtaining multiple perspectives identified that the pharmacist had to trade off competing goals (productivity vs thoroughness, including safety and formulary compliance). Applying the principles identified how pharmacists varied their approach to increase productivity while remaining safe. Learning from this everyday work helped bring work-as-done and work-as-imagined closer together, and several changes to improve system performance were identified and implemented.
Access to hospital electronic prescribing information
This ensured pharmacists had the information needed to complete the task (System condition—resources). It also reduced work in other sectors (Interactions) and increased the efficiency of task completion and so reduced delays for patients (Flow).
Reorganisation of the weekly timetable
The timetable for the week was changed to prioritise other prescribing tasks at the start of the week and to complete medication reconciliation later in the week (System condition—capacity/demand). Through discussion of system conditions, the pharmacist identified that certain discharges took longer to complete, resulted in further contact with the practice (with a resultant increase in GP workload) or carried an increased risk of patient harm. Discharges with these features were prioritised and completed early in the week in an attempt to mitigate these problems.
Minimum specification protocols
Protocols were changed to a minimum specification to allow local adaptation by pharmacists (System condition—constraints). This supported the pharmacists to employ a variety of responses depending on the context (Performance Variability), which reduced pharmacists’ concerns about blame if they did not follow the protocol (Understand why decisions made sense). For example, after a short admission where it was unlikely that medication had been changed, pharmacists did not need to contact secondary care about medication not recorded on the discharge letter (Understand why decisions made sense). If they felt they did have to check, the option of contacting the patient was included. Similarly, the need to contact all patients after discharge was removed; pharmacists could use other options, such as contacting the community pharmacy, if more appropriate (Performance Variability).
GP mentoring sessions
Regular GP mentoring sessions were included, as pharmacists found that discussing cases with GPs allowed them to consider the benefits and potential problems of their actions in other parts of the system (Interactions and Performance Variability). For example, rather than limiting the number of times certain medications could be issued, they ensured practice systems for monitoring were used. The sessions also allowed pharmacists to consider when they needed to be more thorough at the expense of efficiency (Performance Variability), for example, when there were leading indicators of problems such as high-risk medication.
Discussion
This paper describes the adaptation and redesign of previously developed system principles for generic application in healthcare settings. The STEW principles underpin and are characteristic of a holistic systems approach. The case report demonstrates application of the principles to analyse a care system and to subsequently design change through understanding current work processes, predicting system behaviour and designing modifications to improve system performance.
We propose that the STEW principles can be used as a framework for teams to analyse, learn and improve from unintended outcomes, reports of excellent care and routine everyday work ‘hassles’.36 37 The overall focus is on team and organisational learning, for example through small group discussion to promote a deep understanding of ‘how everyday work is actually done’ (rather than fixating only on things that go wrong). This allows exploration of the system conditions that make people vary how they work, the identification and sharing of successful adaptations and an understanding of the effect of adaptations elsewhere in the system (mindful adaptation). From this, we can decide whether variation is useful (and support staff in doing it effectively) or unwanted (and then consider the system conditions to try to dampen it). These discussions can help reconcile work-as-done and work-as-imagined, although, as conditions change unpredictably, new ways of working will continue to evolve, so we must keep exploring and sharing learning from everyday work, not just when something goes wrong.
The focus of safety efforts, in incident investigation and other QI activity, is often on identifying things that have gone wrong and implementing change to prevent ‘error’ recurring.20 The focus is often on the ‘root causes’ of adverse events or categorising events most likely to cause systems to fail (eg, using Pareto charts).20 38 This linear ‘cause and effect’ thinking can lead to single components, deemed to be the ‘cause’ of the unwanted event or care problem, being prioritised for improvement. Although this may improve the performance of that component it may not improve overall system functioning and, due to the complex interactions in healthcare systems, may generate unwanted unintended consequences. The principles promote examining and treating the relevant system as a whole which may strengthen the way we conduct incident investigation and how we design QI projects.
To successfully align corrective actions or improvement interventions with contributing factors, and therefore ensure actions have the desired effect, a deep understanding of everyday work is essential.39 Methods such as process mapping are often promoted to explore how systems work and, when used properly, can be useful aids for healthcare improvers. To model and understand work-as-done more closely, the STEW principles could be applied alongside such methods to show the influences on components that affect performance, such as feedback loops, coupling to other components and internal and external influences.
The STEW principles may also support another commonly used QI method: Plan, Do, Study, Act cycles.40 It has been suggested that more in-depth work is often required in the planning and study stages of improvement activity, especially when dealing with complex problems.40 The application of the principles may help explore factors that will influence change (such as resources, interactions with other parts of the systems and personal and organisational goals). Similarly, during the study phase, the principles can help explore how system properties prompted people to act the way they did. This level of understanding can then inform further iterative cycles.
Patient care is often delivered by teams across interfaces of care, which further increases complexity.41 It is estimated that only around half to three-quarters of actions recommended after incident analysis are implemented.21 Although this is often due to a lack of shared learning, local action plans and involvement of key stakeholders,21 those investigating such cases may feel unable to influence change in such a complex environment. This may result in a focus on changes to single processes that are perceived as manageable or feasible. Obtaining multiple perspectives on work and improvement encourages a team-based approach to learning and change, but systems are still required to ensure learning and action plans are shared. Although the principles have been used in incident investigation and to influence organisational change across care interfaces, simply introducing a set of principles will not, by itself, improve the likelihood of effective system-level change being implemented.42 43 Training on, and evaluation of, the application of the principles is required.
Understanding how safety is created and maintained must involve more than examining when it fails. Improvement interventions often aim to standardise and simplify current processes. Although these approaches are important, in a resource-limited environment it will never be possible to implement organisational change to fix all system problems. Even if this were possible, as systems evolve with new treatments and technology, conditions will emerge that have not been considered. To optimise success in complex systems, the human contribution to creating safety, through the adaptations always required for safe working, needs to be explored, understood, appreciated and supported.44 Studying systems using the principles may help workers who make such adaptations to be more mindful of wider system effects.
There is growing interest in healthcare in how we can learn more from how people create safety. The Learning from Excellence movement promotes learning and improvement from the analysis of peer-reported episodes of excellent care and positive deviancy aims to identify how some people excel despite facing the same constraints as others.36 45 The Safety-II systems approach that influenced these principles is similar in that it focuses on how people help to create safety by adapting to unplanned system factors and interactions.
By understanding why decisions are made, the application of the principles supports the development of a ‘Just Culture’; indeed, this was one of EUROCONTROL’s original principles and was incorporated into the principle ‘Understand why decisions made sense at the time’. A ‘Just Culture’ has been described as ‘a culture of trust, learning and accountability’, where people are willing to report incidents where something has gone wrong because they know the reports will inform learning to improve care and not be used to assign blame inappropriately.35 Our approach aims to avoid unwarranted blame and to increase support and learning for healthcare staff when something has gone wrong.46 47 Furthermore, application of the principles may empower staff and patients not just to report incidents but to contribute to analysis and become integral parts of the improvement process through coproduction of safer systems. Obtaining the perspective of the patient when applying the principles is critical to understanding and improving systems, as patients are often the only constant when care crosses interfaces. This type of approach to improvement is strongly promoted and may avoid short-sighted responses to patient safety incidents (eg, refresher training or new protocols), resulting in the design of better and more cost-effective care systems.48
Alternative methods exist for modelling and understanding complex systems, such as the Functional Resonance Analysis Method,49 and a complex systems approach is used in accident models such as Systems Theoretic Accident Modelling and Processes50 and AcciMaps.51 These robust methods for system analysis are difficult for front-line teams to implement without specialised training.29 The STEW principles, on the other hand, were designed with front-line healthcare workers to allow non-experts to adopt this type of thinking to understand and improve systems. The influence of work conditions, including organisational and external factors, on safety has been appreciated for some time and is included in other models used in healthcare to explore safety in complex systems.52–54 The Systems Engineering Initiative for Patient Safety (SEIPS) model is arguably one of the best known systems-based frameworks in healthcare.53 While this model promotes seeking multiple perspectives to describe the interactions between components, the STEW principles focus on how these interactions influence the way work is done and may thus complement the use of the SEIPS model.
Strengths and limitations
Any consensus method can produce an agreed outcome, but that does not mean the outcome is wholly adequate in terms of validity, feasibility or transferability. Only 15 participants were involved in the initial development, with 32 more in the later workshops; however, a wide range of professions with significant patient safety and QI experience were recruited. The appraiser workshops were attended by both primary and secondary care doctors and by other staff groups. Their comments were used to further refine the principles, but no attempt was made to assess their agreement on the importance and applicability of the principles. The principles have not yet been shown in practice to improve performance, and further research and evaluation of their application in various sectors of healthcare is needed.
Systems thinking is essential for examining and improving healthcare safety and performance, but a shared understanding and application of the concept is not well developed among front-line staff, healthcare improvers, leaders, policymakers, the media and the general public. It is a complicated topic and requires an understandable framework for practical application by the care workforce. The developed principles may aid a deeper exploration of system safety in healthcare as part of learning from problematic situations, everyday work and excellent practices. They may also inform more effective design of local improvement interventions. Ultimately, the principles help define what a ‘systems approach’ actually entails in a practical sense within the healthcare context.
Under UK ‘Governance Arrangements for Research Ethics Committees’, ethical research committee review is not required for service evaluation or research which, for example, seeks to elicit the views, experiences and knowledge of healthcare professionals on a given subject area.55 Similarly ‘service evaluation’ that involves NHS staff recruited as research participants by virtue of their professional roles also does not require ethical review from an established NHS research ethics committee.
The authors thank all those who contributed to the adaptation of the principles and Michael Cannon for his comments from a service user’s perspective.
Twitter @duncansmcnab, @pbnes
Contributors DM, JM and PB conceived the project. SS developed the original principles and led the consensus building workshop. DM and SL collected the data. DM, SL, SS, JM and PB analysed the feedback to adapt the principles. DM drafted the original report and SL, SS and JM revised and agreed on the final manuscript.
Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests None declared.
Patient and public involvement Patients and/or the public were involved in the reporting of this research; refer to the patient and public involvement section for details.
Patient consent for publication Not required.
Provenance and peer review Not commissioned; externally peer reviewed.
Data availability statement Data relating to the stages of the consensus-building process are available upon reasonable request.