Abstract
Introduction: Online resources are an important source of information about mental health issues and services for children and young people. Our service’s website had an out-of-date appearance and was aimed at professionals. More importantly, comments in our routinely collected patient experience data indicated that service users did not know what to expect when coming to our service.
Methods: We followed the Model for Improvement, testing out changes in plan, do, study, act cycles that included a review of recently updated child and adolescent mental health services’ and youth charities’ websites, designing a new web page for our service and then testing the website in focus groups. We used routinely collected patient experience data to assess the impact on wider patient satisfaction.
Results: Focus groups involving patients, parents and professionals judged the new website to be clearer, more attractive and easier to understand. Routine patient experience data did not reveal any website-specific feedback.
Conclusion: This study demonstrates that it is feasible to create an attractive and accessible website for a mental health service using quality improvement methodology. To capture and integrate ongoing feedback about a service’s website from service users, routinely collected patient experience measures would need to ask questions specific to this area; in this study, pre-project and post-project patient experience data did not generate any website-specific comments.
- mental health
- quality improvement
- consumer health information
- access to information
- information technology
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
Problem
The King’s College Hospital paediatric liaison service (PLS) is a mental health team working within a busy inner city acute trust setting. The team is small, consisting of approximately 10 staff, including doctors, specialist nurses, psychologists and administrative staff. The PLS assesses patients under 18 years who present to the King’s Emergency Department in mental health crisis, as well as providing outpatient clinics specialising in functional illness and neuropsychiatry. The service’s routinely collected patient experience data revealed comments by service users describing concern about what was involved in coming to the PLS for outpatient work.
Not knowing what to expect was both scary for me and my daughter.
A paper information leaflet explaining what the service did and what to expect from an appointment was already available, but it was out of date and generally only given to people prior to their first appointment. When asked informally during appointments, service users expressed a preference for being able to find information about the service online. The service had a set of pages on the existing Trust website, but these were written in professional language and had an outdated and inaccessible interface; the Trust website had been awaiting an update for some years, but at the time of the project no date had been set for the redesign, hence the decision to undertake this work. The project aimed to improve the information available to service users referred to the PLS by providing a clear and accessible website.
Service user involvement took place at the second stage of the website’s development, modifying the initial (beta) version. The beta version was created by AA using Google Sites, with the design and content based on those of other organisations (other mental health trusts and charities with contemporary websites, e.g. Young Minds and Mind). The project made the assumption that these organisations had been able to afford and organise wider user testing and consultation than the PLS itself could.
The project aimed to improve preassessment patient experience. Since the Trust within which the PLS operates routinely seeks service user experience feedback via the Friends and Family Test (FFT), the project aimed to capture feedback about the website within the free-text boxes of the Trust version of the FFT.
Background
Online resources about mental health are an important source of information for young people,1 and mental health service providers should be able to guide young people to appropriate online information.2 Websites are deemed to be a way to reduce barriers for adolescents accessing mental health services.3 Despite a 2014 systematic review4 highlighting that more research was needed to assess if websites facilitate help-seeking in young people, the authors commented: ‘across all studies, young people regularly used and were generally satisfied with online mental health resources’. Given this context, mental health service providers have been asked to consider providing online resources as an ‘adjunct to offline help seeking’.5
Our service pages within the Trust website were aimed at professionals only, were difficult to find and did not give details about what to expect at a first assessment or how to travel to the clinic. The Trust website was awaiting redesign, but the wait had already lasted some years with no clear date set, and we decided we needed a web resource more rapidly than the larger organisation could provide. We therefore conducted a survey of other child mental health and mental health websites, as well as carrying out a literature review to explore whether online service resources improved service user experience of a mental health service; no relevant studies were identified. We also researched the process for designing and user testing websites. This led us to usability testing, which is used to evaluate user satisfaction and experience with products and systems, including websites.6 The process is recognised for use in the development of health promotion websites. Key themes in testing are ‘design, feedback, format, instructions, navigation, terminology and learnability’.7
Measurement
We collected service user and staff feedback about the service’s pages on the existing Trust website and then compared these with their feedback about our new template website. We used a five-item questionnaire in a series of family groups held in our clinic, or sent it out by email to local general practitioners. Questions asked for participants’ opinions of the layout, colour scheme and pictures on a five-point emoji scale from 1 (unhappy face) to 5 (smiling face). We also asked if the information on the page was easy to understand, if there were any specific words that they did not understand and whether the page helped them understand what our team did. We also had space for free-text comments (online supplemental appendix 1). A staff member was on hand to deal with any queries from the family groups.
The rationale behind choosing these measures was that we could quantify satisfaction with the design and content of the website, which we believed were key factors in producing an informative and useful health website.7
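As a concrete illustration, ratings on a five-point emoji scale lend themselves to simple per-question averaging when comparing the two sites. The sketch below shows one way such aggregation could be scripted; the question keys and scores are hypothetical examples, not the project’s actual data.

```python
# Minimal sketch of aggregating five-point emoji-scale ratings per question.
# Question keys and scores are illustrative only, not the project's dataset.
from statistics import mean

# Each tester's response maps question -> score (1 = unhappy ... 5 = smiling).
responses_old = [
    {"layout_colour": 3, "pictures": 2},
    {"layout_colour": 4, "pictures": 3},
    {"layout_colour": 4, "pictures": 3},
]
responses_new = [
    {"layout_colour": 5, "pictures": 5},
    {"layout_colour": 5, "pictures": 4},
    {"layout_colour": 4, "pictures": 5},
]

def mean_scores(responses):
    """Return the mean rating for each question across all testers."""
    questions = responses[0].keys()
    return {q: round(mean(r[q] for r in responses), 2) for q in questions}

print("Old website:", mean_scores(responses_old))  # layout_colour: 3.67, pictures: 2.67
print("New website:", mean_scores(responses_new))  # layout_colour: 4.67, pictures: 4.67
```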
We also looked at overall patient experience of our service by reviewing our routinely collected Trust FFT data, which also include free-text comments. We reviewed the responses for the 6 months preimplementation and 6 months postimplementation of our new website to see if there were any significant changes in overall satisfaction or in the comments made.
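The quantitative half of such a pre/post review reduces to comparing the proportion of respondents who would recommend the service in each window, alongside a qualitative read of the free text. The sketch below illustrates this comparison; the counts are invented for illustration and are not the project’s FFT figures.

```python
# Minimal sketch of a pre/post FFT comparison; all counts are hypothetical.
windows = {
    "6 months pre-launch": {"would_recommend": 46, "responses": 50},
    "6 months post-launch": {"would_recommend": 47, "responses": 51},
}

for label, counts in windows.items():
    # Proportion of respondents who would recommend the service to friends/family.
    rate = counts["would_recommend"] / counts["responses"]
    print(f"{label}: {rate:.0%} would recommend "
          f"({counts['would_recommend']}/{counts['responses']})")
```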
Design
Our project used ideas from the Model for Improvement, testing out changes in small plan, do, study, act (PDSA) cycles.8 However, we also wanted to use methods from usability testing, as we knew this would allow service users’ ideas to be incorporated into our website.7 Regular project team meetings, some including our Trust quality improvement (QI) support team, were held to evaluate the project’s progress and fidelity to the QI methodology. Participants were recruited into family feedback groups via convenience sampling: service users who had appointments at our service were asked if they would like to participate and if they had time. Participants were stratified in order to capture the range of service users we see: children, young people, parents/carers and staff. Participants were asked to review the old and new websites’ design and navigability, as well as complete several tasks on the new website (see online supplemental appendix for details). They then completed the feedback questionnaires. This report was written using the Standards for Quality Improvement Reporting guideline.9 Results figures were produced using Microsoft Excel.10
Patient and public involvement
Direct service user feedback was key to producing a website that would be useful for service users. The latter were consulted at the usability group testing stage, and we planned to invite keen and interested individuals onto the ongoing project team. This final step did not happen because the Trust began their Trust website redesign, which recruited other service users onto the larger design team. Learning from this project was forwarded to the lead of this larger piece of work.
Strategy
PDSA 1: review of existing websites in order to produce a driver diagram
AA and VD reviewed all existing London child and adolescent mental health services (CAMHS) websites and youth mental health charity websites published online in March 2019. The findings were then discussed with the rest of the team, and key factors were agreed for any new website aimed at service users and stakeholders. Following this, a driver diagram was created using the LifeQI web program.11 This broke the website down into key factors and the website features that followed from each (figure 1).
PDSA 2: creating a template website and testing with users
The website was created using Google sites12 free online software and incorporated as many of the features included in the driver diagram as possible.
The Trust communications team were understandably reluctant for PLS to host its own website. Consent to progress the project was agreed on the basis that PLS had been asking for a better website interface for 6 years and that no launch date had at that point been set for the Trust website redesign, provided the PLS website adhered to the Trust guidelines for web pages (including being hosted on an nhs.uk page). We applied for the latter, which was quickly approved.
Having created the beta version of the new PLS website, we held a series of usability testing focus groups with service users (children, young people and parents/carers) and stakeholders. These were used to collect opinions on the pre-existing Trust PLS website pages and the new PLS website. The sessions included completing brief tasks aimed at assessing how easily, or not, users could navigate the new website.
The tasks created by the project team related to the two main reasons for involvement with the PLS: an outpatient referral or an emergency assessment. The tasks were targeted to each user group; for example, a young person: ‘Imagine you’ve come to A&E after taking an overdose and the nurse tells you they have asked the Paediatric Liaison Team to see you. Look at our website – would this give you an idea of what to expect from seeing us?’ or a parent: ‘You’re bringing your child for an appointment about unexplained physical symptoms/for an epilepsy surgery assessment/that your oncologist requested. Look at our website: does it help you know where to go, what to expect and who needs to attend?’ (online supplemental appendix 1).
PDSA 3: using user feedback to make website changes
The information collected from the focus groups was then discussed in the project team and used to update the website, which remained live throughout, so changes took effect immediately. Because the website had been created ‘in-house’ (as opposed to by an external agency), there was no requirement for changes to be reapproved by the Trust communications team, as long as they adhered to Trust guidance.
We reviewed results from the FFT data to see if there were any changes to patient experience.
PDSA 4: feedback to a wider Trust project
Our original project plan was to continue cycles of focus group testing, with the inclusion of one or more service users in our project team; one young person had already been approached and had agreed to do this. This plan was overtaken when the Trust announced that they were going live with a redesign of the entire Trust website, covering the web pages for all services, including PLS. We therefore forwarded our results on to the wider Trust project lead.
Results
Our sample of service users was stratified into children under 13 years (n=2), young people aged 14–17 years (n=3), parents (n=2) and professionals (n=2). Two-thirds of the sample were female, and the majority were Caucasian.
In this section, we refer to the existing Trust PLS website pages as the ‘old website’ and the newly designed PLS beta version website as the ‘new website’. Overall, the new website was rated better by our focus group participants (testers) than the old website in all areas (figure 2).
In terms of design, questions were broken down into what testers thought about the layout and colour scheme (old website 3.5/5, new website 4.8/5) and pictures (old website 2.75/5, new website 4.6/5).
In terms of content, only 56% of testers felt the information on the old website was easy to understand, compared with 100% of testers on the new website. Twenty-eight per cent found there were words they did not understand on the old website:
Unipolar – some of the conditions weren’t very easy to understand. (Latency-aged patient)
The language used felt very clinical/professional with limited practical support. It is very easy to get lost & confused by the number of different departments. (Middle-aged parent)
Zero per cent of participants felt there were words they did not understand on the new website.
All of the wording was easy to understand. (Latency-aged patient)
Only 55% of testers felt the old website helped them understand what our team did.
The overview page is completely just covered with a long passage containing lots of information which is a bit overwhelming and it’s difficult to take all of the information in. (Teenage patient)
One hundred per cent of testers felt the new website helped them understand what our team did.
The new layout is incredibly clear. I could immediately understand what I needed, why I was there & what you do. The language is clear, non-medical & warm. The site looks supportive (if a site could look like that) but the overall impression is I know clearly what you do & I know what to do in a crisis. I wasn't pulled away to any other areas. It is great! (Middle-aged parent)
Usability tasks and free-text comments were able to highlight areas for improvement for our new website. These included:
Adding a statement about how service users can help their friends and that ‘You can tell the doctors if you feel uncomfortable’.
Changing the sentence structure.
Changing ‘contact us’ to ‘find/contact us’ to make it clearer that the section included directions.
Using bold text to highlight different clinic areas in the professionals’ section.
Colour changes.
Possible changes were discussed in the project team and then made on the website by one of its members. All changes suggested by service users were implemented.
Reviewing the FFT data, there were no clear changes to patient experience after the switch to the new website. The measure asks whether service users would recommend our service to friends and family, and this overall measure was positive both in the 6 months before and the 6 months after the introduction of the website. There were no specific free-text comments relating to the website.
Lessons and limitations
The biggest strength of this project is that we were able to make changes to our new website based on direct feedback from service users with lived experience of using our service. The data we collected showed that service users and stakeholders found our new website to be better designed and to provide more understandable information.
As a small service, we did not have the staffing or resources to design a website from scratch with service users involved from the start of the project. We therefore had to rely on the service user involvement assumed to have informed major child mental health service website redevelopments to generate design and navigation ideas for the first version of our new website.
The project was limited by the small sample size of the usability testing. As such, the suggestions made and implemented may not be generalisable to all our service users. Had the Trust website redesign not overtaken our work, we had intended to continue focus group testing in an iterative manner. This would have resulted in a larger sample size and more generalisable feedback.
The questions in our focus group were decided by members of the project group and were not validated measures, so may not have been sensitive enough to detect some changes.
As the patient experience FFT data were general, they were not sensitive enough to measure any changes directly related to the website and were more strongly affected by other factors, such as interaction with professionals at the appointment or the clinical workload of the service.
Had the project continued, we would have wanted to test out possible clinical benefits of the new website hosting a ‘crisis information’ page.
Conclusions
Although switching to our new website did not produce any significant changes in patient experience, the feedback, collected directly from usability testing, showed that our new website was felt to be better designed and easier to understand by service users and stakeholders.
Our project emphasises the importance and usefulness of including service users at all levels of service development. Our project’s scope was limited by Trust-level changes.
Further research would be useful to look at whether collaboratively produced websites can help reduce barriers to accessing child and adolescent mental health services such as ours.
Acknowledgments
We thank Dr Bodvar Ymisson and Mr Harry Moss of the King’s Paediatric Liaison Team, who helped collect data during usability testing.
Footnotes
Contributors AA and VD conceived the project idea and designed the project. AA developed the website template. Data were collected by AA, VD, BS, BY and HM. BS collated the data into Excel, which was then analysed and interpreted by AA. The article was drafted by AA with revisions and input from VD. AA and VD approved the final submitted version. AA submitted the paper and is the guarantor.
Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests None declared.
Patient consent for publication Not required.
Provenance and peer review Not commissioned; externally peer reviewed.
Data availability statement All data relevant to the study are included in the article or uploaded as supplementary information.
Supplemental material This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.