
Developing Healthcare Team Observations for Patient Safety (HTOPS): senior medical students capture everyday clinical moments

Abstract

Background

Aviation has used a real-time observation method to deliver anonymised feedback to front-line staff and improve safe practice. Using an experiential learning method, this pilot study aimed to develop an observation-based real-time learning tool for final-year medical students, with potential wider use in clinical practice.

Methods

Using participatory action research, we collected data on medical students’ observations of real-time clinical practice. The observation data were analysed thematically and shared with a steering group of experts to agree a framework for recording observations. A sample of students (observers) and front-line clinical staff (observed) completed one-to-one interviews on their experiences. The interviews were analysed using thematic analysis.

Results

Thirty-seven medical students identified 917 issues in wards, theatres and clinics in an acute hospital trust. These issues were grouped into the themes of human influences, work environment and systems. Aviation approaches were adapted to develop an app capable of recording real-time positive and negative clinical incidents. Five students and eleven clinical staff were interviewed and shared their views on the value of a process that helped them learn and has the potential to advance the quality of practice. Concerns were shared about how the observational process is managed.

Conclusion

The study developed an app (Healthcare Team Observations for Patient Safety—HTOPS) for recording good and poor individual and team clinical behaviour in acute-care practice. The process advanced medical student learning about patient safety. The tool can identify the totality of patient safety practice and illuminate strengths and weaknesses. HTOPS offers the opportunity for collective ownership of safety concerns without blame and has been positively received by all stakeholders. The next steps will further refine the app for use in all clinical areas for capturing ‘light noise’.


Key messages regarding feasibility

  • What uncertainties regarding feasibility existed prior to this study?

It remains difficult to teach undergraduate healthcare students about the realities of patient safety in complex everyday clinical situations. We set out to find a way to address this problem and turned to aviation, where anonymous observation of the crew has advanced safe practice. In this study, we present work to make this a reality for final-year medical students, mindful of the acceptability of being an observer and being observed.

  • What are the key findings on feasibility from this study?

Using cyclical action research over several years, we designed an intuitive app for medical students in their final year. Drawing on expertise from aviation and having a wide stakeholder steering group was important as the group received staged outcomes and guided the study. Medical students in this study helped to translate the aviation approach into a method for student learning with opportunities for front-line staff to reflect on their practice. We adapted the aviation methodology for healthcare and moved from a paper data collection tool to an easy-to-use app. We now have a usable observation tool to appreciate the complexity of patient safety. We learnt that it was acceptable to observe good and suboptimal practice.

  • What are the implications of the findings on the design of the main study?

The methodology helped to adapt thinking from aviation to healthcare, but we have work to do on culture change for the acceptability of anonymous observation in all healthcare arenas. Building on this study, we aim to further pilot and test the use of the app in more clinical areas using training and preparation sessions for acceptability by all clinical staff.

Background

In the last decade, the analysis of patient safety events has led to the identification of circumstances which contribute to safe and unsafe patient care [1]. These include not just active failings, but a wider range of more latent influences such as human factors, systems, aspects of the environment and poor professional practice. Interventions to address these contributory factors have led to some reduction in errors but high levels of concern remain [2,3,4,5,6]. The analysis of data after safety events has been helpful, although accurate recording and understanding of the totality of patient safety data remains a challenge, as busy practitioners fail to report events, often because they are not shown the benefits [7,8,9]. There are still big questions to be addressed about how to make changes that will advance standards for safe effective care [10].

Mastering the complexity of healthcare delivery remains tortuous. Hard data for accountability purposes highlights variability in one area of practice at one moment in time but fails to assimilate and understand the integration of care delivery across all levels of an organisation [11]. It often fails to pick up the nuances involved, including the patient and practitioner perspectives. Triangulation of data to include patient and staff perceptions may offer a deeper understanding because it values users or front-line stakeholders as owners of the standards of their work [12]. Findings from large-scale studies reveal that front-line practitioners need the right resources, including staffing levels, support and encouragement from leaders who are collaborative, values-based and uphold person-centred care [13]. Practitioners also need simple holistic feedback about ‘uncomfortable information’ and the ‘blind spots’ that need to change [13, 14]. Data collection that is not meaningful to healthcare staff at all levels of an organisation can generate suspicion and resistance and militate against the aspirations for building a safe culture for healthcare delivery [15]. Calls for the next steps for patient safety seek greater clarity about how to identify and measure hazards in real time to intervene before incidents occur [10], and to focus more attention on what works well, learning from good practice [16, 17]. Moving from the reactive focus on the negatives, what goes wrong, towards a proactive focus on good practice requires a different approach to seeking and learning from clinical successes. Positive deviance highlights the success of individuals, teams and organisations and seeks to inspire others to adopt positive solutions to achieve effective safe practice [4]. This requires transparency and willingness to share, and for others to adapt and adopt tested work patterns within their locality [11].

Potentially valuable sources of knowledge, about the good and the potentially problematic features of care, are groups such as junior clinicians and medical students, who experience multiple settings and so have potentially unique comparative insight. The Francis and Keogh reports both suggested a special place for junior doctors and student nurses in identifying differences in standards of care between the organisations employing them, based on their simultaneous position as insiders and outsiders within the system [18,19,20]. Ladden and colleagues considered the place of students who stand outside and yet are within everyday clinical practice and from this unique vantage point perceive what is going on: ‘Ask any medical resident or graduate nursing student working on the front lines of care about quality and safety problems and you had better be prepared to listen for a while. What they see is what we all read about: serious shortcomings in our systems of care — delays, errors, confused families, and daily workarounds to get patients what they need. Despite their front-line view of the problems, learners are almost never involved in workplace-based experiences to learn about systems improvement’ [21]. Only recently have healthcare curricula been expected to deepen student understandings of patient safety at pre-registration/undergraduate level [22,23,24], despite calls from the World Health Organization who have designed a patient safety curriculum guide [25, 26]. The General Medical Council (GMC), in asking for more teaching on patient safety, has highlighted some medical school programmes where students have been asked to observe what is happening in everyday practice [27]. Despite this, many medical students remain unfamiliar with the scope of learning for safe practice [28]. 
Transforming healthcare education and equipping students with the right approach is seen as one of the essential changes required, if we are to change culture and advance safe practice [10]. An untested route to identify both good and poor quality care could emerge from proactive students [29]. Students observe poor practice [30] and could also anonymously highlight hazards in everyday practice and simultaneously advance their learning concerning patient safety.

Background to the study

Looking to advance medical student understandings of patient safety and aware that they are observers of the system, we were drawn to observation approaches used in aviation to inform best practice. Adopting learning from high-risk industries for patient safety has already taken place within healthcare delivery [31, 32]. Our journey of discovery started with an aviation strategy to identify the contributory factors which underpin human error. This strategy, known as the line operations safety audit (LOSA), is used to collect data about flight crew behaviour and situational factors on flights. The first audit was developed in the USA and has been internationally adopted [33, 34]. The findings collected over time have illuminated how people behave in real time, offering evidence to improve safety [35, 36]. The attractions of this real-time observation reporting are that the data are collected prospectively and anonymised, with the results fed back to help change practice. This work has contributed to the flattened, non-hierarchical, team-based culture that characterises modern aviation [37]. While acknowledging important differences between civil aviation and healthcare that are sometimes overlooked in efforts to import safety interventions from one context to the other [38], we nevertheless saw potential in adapting this approach with a view to both identifying influences on healthcare safety and enriching undergraduate education. We outline the process of adaptation for medical student learning.

Methods

Aims and objectives of the study

This exploratory research aimed to develop an observational recording process for patient safety learning by final-year medical students. The study research protocol was funded, in September 2018, by the University of Leicester Wellcome Trust, Institutional Strategic Support Fund (grant RM32J0012M3). The bid aimed to build on our understandings of LOSA as a self-administered (organic), proactive risk-management tool that records ‘Threats’ and ‘Errors’ (International Civil Aviation Organisation, 2002) to develop a usable healthcare normal operations safety audit intervention. There were two intended outcomes from the study: (1) to produce a learning tool for educating medical students in patient safety and (2) to design a patient safety risk-management tool. The development was conceptualised as a staged process. We report on the first stage only.

Study design

The design involved the cyclical collection of student observational data over three iterations, following the principles of participatory action research (PAR) [39], to reach a pragmatic approach for healthcare similar to that used in aviation [40]. In this mixed methods design, the researchers’ understanding of the social setting was developed through the data generated by participants (including participating students’ observations of clinical practice in real-time, and the reflections of those involved in the process); this improved understanding then contributed to further improvement of the intervention over the course of the study [41].

Participants

This iterative process involved a research partnership between the stakeholders and researchers, overseen by a steering group (patients, students, an airline pilot, academics, clinicians, local patient safety leads). The steering group met twice annually, before and after student observations.

The data collectors were final-year medical students. We saw this group as particularly appropriate for the project, since they have undertaken substantial clinical training and so have the ability to understand the constituent parts of good practice, but are not yet embedded in any healthcare organisation, and so retain the perspective of an outsider. There was only one access point to final-year medical students: the annual spring special study module (SSM), in which students choose from a range of learning possibilities to enhance their abilities to work as junior doctors. This project was submitted as an SSM and attracted students who selected the project annually over the 3 years.

Data collection

Data were collected iteratively over 3 years, following the schedule of final-year medical students completing the SSM. Prior to data collection, the student participants were prepared for their role as researcher data collectors. They spent 1 day reflecting on their learning on patient safety, including human factors, facilitated by experts. The second day was spent learning about being an observer researcher and becoming familiar with the data collection tools. In the first two cycles of data collection, recordings were made using paper grids which evolved and changed over time. In the third and final cycle, data were collected using the first version of an app developed from experiences in the first 2 years. Students were asked to record what they saw using scales and with detailed written comments. This year-on-year learning across successive student cohorts enabled changes to be made, and in this way the student observations and interactions with real-time clinical events shaped the design of the final app-based data collection mechanism. As the SSM covered 3 weeks, students were able to spend up to 9 days observing; the remainder of their time was spent in feedback and training.

Data analysis

The students’ scored and written observational data were analysed by the academic researchers (ES and LG) for clarification and agreement with individual students halfway through the learning placement, to identify problems (e.g. inconsistencies, lack of clarity) in recording their observations. At the end of their observation period, students presented their findings for the first time to the clinical staff and senior patient safety leads in the organisation where they had undertaken their observations. The recorded observations were collected and analysed for common themes using the principles of thematic analysis (ES and LG) [42]. The scoring scheme evolved over time from scales of 1–5 indicating the severity of poor practice to a scale which graded both good and poor practice using two levels each, supported by event descriptions.

Ward and other clinical areas covered by the study were selected by the steering group partnership, which involved local hospital patient safety leads and clinicians who were medical educators and familiar with patient safety. The academic clinical leads briefed and prepared the clinical areas to receive the students. Wards, clinics and operating theatres were involved.

In the last cycle, all stakeholder perceptions and experiences of the process were collected using one-to-one semi-structured interviews with medical students (observers) and clinical teams (observed practitioners), conducted by an independent researcher (FW) who was not known to the medical students. The interviews continued until theoretical saturation was reached. Interviews were guided by a topic guide focusing on the experiences of both the observers and the observed. With the consent of the participants, all interviews were audio recorded and later transcribed and analysed using thematic analysis. In this way, the experiences of the key stakeholders were clarified.

This study received ethical permission from University of Leicester (7741-esa1-medicaleducation).

Results

We present the development of the Healthcare Team Observation for Patient Safety (HTOPS) platform and process chronologically over three stages; the timeline can be seen in Fig. 1. The first stage combines the first two cycles of learning as this was an exploratory phase. The work was refined over 3 years, and we reflect on key learning points that fed into the development and refinement of the system.

Fig. 1
figure 1

Time line of the development of Healthcare Team Observations for Patient Safety (HTOPS)

Stages of the observation tool development

Stage one (2016–2017)

The adaptation of the aviation observation process to identify patient safety concerns started with the final-year special study module (SSM) in 2016 and 2017 (n = 11 in each cohort; total n = 22). We started with aviation terminology, namely ‘Threats’ observed in the working environment and ‘Errors’, i.e. perceived noncompliance with rules/policy/guidelines. In discussion with a clinical team (senior nurse and consultants from a local hospital), a set of possible healthcare threats and errors was agreed through a brainstorming exercise. They included a list of possible ‘Threats’ relating to human factors, technology and building/environment. The ‘Error’ list included noncompliance with rules relating to prescribing, ordering investigations and their interpretation, and patient and practitioner communication. We gave a code number to each possible threat and error. The students were asked to complete observations in a range of clinical areas: two operating theatres receiving orthopaedic and urology cases, outpatient fracture clinics, ante-natal wards and clinics, and medical wards. Students were given training on patient safety, observation techniques and the coding system template with the threat and error code list to record what they observed during a session (morning or afternoon) (Table 1). The students spent 6 days observing clinical practice, moving between their allocated wards, theatres or clinics.

Table 1 Stage one: design for recording observations (pilot 2016)

Evaluation outcomes

In 2016, students recorded a large number of observations and we analysed a subset of the data (21 scripts) to check the process (Table 2). We found that students confused threat and error in 13 instances. We refined the paper recording sheet to enable students to write more narrative to justify their findings. In 2017, we analysed all the outcomes from the student observations (n = 373 observations); 43 were illegible and were withdrawn, leaving 330 for full analysis of observed care practice on wards, clinics and theatres, of which 22 were errors (Table 3). Students continued to have difficulty in differentiating between threats and errors in the midst of the complexity of everyday clinical practice, and found the codes cumbersome. However, they were able to step back and observe care delivery in real time, noticing a plethora of concerns relating to both sloppy practice (e.g. hand hygiene) and systems issues (e.g. caused by poor geographical layout). All students reported that they had advanced their understandings of patient safety. The observations revealed that the students’ lack of familiarity with the setting helped them to identify features that seemed inappropriate, whereas practitioners around them had normalised these practices. Some errors reported were incorrect and misleading, reflecting students’ unfamiliarity with speciality-specific safe practice.

Table 2 Stage one 2016 examples (subset of 21 observations analysed 13 incorrect codes)
Table 3 Cohort 2017: summary outcomes (total 330 of which 22 errors)

The evaluation in 2016 and 2017 led us to the conclusion that the categories of threat and error were too simplistic to capture the complexity of the healthcare environment and care delivery (Table 4).

Table 4 Complexity of healthcare. The complexity issues when comparing safety between healthcare and aviation

We revised our categorisation framework to reflect clusters of themes identified in our analysis of the real-time student observations. These were termed tags, as follows (Table 5):

Table 5 Tagging framework HTOPS 2018

Tag 1, human influences. The interactions amongst humans, ‘what I do when I am with others’, and other aspects of healthcare delivery. This includes the way in which one acts or conducts oneself professionally with patients and staff, and individuals’ physical actions performed incorrectly or not completed.

Tag 2, work environment. Relating to the physical layout/style and content within the building.

Tag 3, systems. Things or parts that function together: the way humans interact with the environment, including the level of staff required to function adequately to manage the clinical area.

To help identify the level of concern, each tag was awarded a score on a scale from 1 (a little concern) to 5 (a great deal). In addition, the tag could relate to an individual (A = alone) or to practitioners working together in a team (T = team). At this stage we left these senior students to allocate the weight of concern following their patient safety training, which explored never events and serious incidents.

Stage two (2018–2019)

In 2018, seven students used the revised paper recording system and worked in pairs in clinical areas. Of the 638 recorded tags, 123 duplicates (students recording the same observation) were removed. At this time, a new electronic database for recording the data was completed and these 515 safety concerns were transferred to the electronic system. These recordings contained 170 scores rated as ‘1’ (low concern), 206 as ‘2’, 107 as ‘3’, 27 as ‘4’, and five scored as serious, ‘5’ (a great deal). The five serious tags were all human influences (tag 1):

  i. Complacency — action. Anaesthetic drug not labelled during a spinal epidural

  ii. Confidentiality. Computer system open with patient results for everyone to see

  iii. Action. Sharps not disposed of correctly during a procedure

  iv. Team functioning. Surgical whiteboard incorrect documentation of use of needles during surgery

  v. Team functioning. Change in surgical list led to preparation in theatre for the wrong patient

The concerns from this analysis revealed that it was hard for students to rate the severity of patient safety concerns on a five-point scale. For this reason, it was decided to reduce the scale to two points. The steering group reflected on the student feedback and realised that students were also verbally reporting seeing positive behaviour, which the recording system did not allow them to record. It was therefore agreed to capture all that students were seeing, including observations of good practice, resulting in a scale that incorporated two negative (− 1 and − 2) and two positive (+ 1 and + 2) scores for the new app (Table 6 — app design) (Fig. 2).

Table 6 App design
Fig. 2
figure 2

Screenshots of the input form within the app, both before and after completion. A This insert shows how the data can be presented graphically to total the number of observations by type, from positive to negative. B This insert shows how the data can be presented graphically, listing the person observed by type. The colour code shows the number of times a particular practitioner was observed and the type of observation, from positive to negative. C This insert shows how the data can be presented graphically by colour code showing the descriptor — these can be opened in the app to show the detailed description
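For readers who want a concrete picture of the final recording scheme (three tags, a four-level score with no neutral point, and an individual/team flag), the minimal sketch below models a single observation record in Python. The field names and validation are our own illustrative assumptions for exposition; they are not the app’s actual schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of one HTOPS observation record.
# Tag numbers and score levels follow the scheme described in the text;
# field names are illustrative assumptions, not the app's real schema.

TAGS = {1: "human influences", 2: "work environment", 3: "systems"}
SCORES = {-2, -1, 1, 2}          # two negative and two positive levels, no zero
SUBJECTS = {"A": "alone", "T": "team"}

@dataclass
class Observation:
    tag: int          # 1, 2 or 3
    score: int        # -2, -1, +1 or +2
    subject: str      # "A" (individual) or "T" (team)
    description: str  # free-text account of the event

    def __post_init__(self) -> None:
        # Reject records that fall outside the agreed scheme
        if self.tag not in TAGS:
            raise ValueError(f"unknown tag {self.tag}")
        if self.score not in SCORES:
            raise ValueError(f"score must be one of {sorted(SCORES)}")
        if self.subject not in SUBJECTS:
            raise ValueError("subject must be 'A' or 'T'")

# Example: a positive team-level observation under tag 1
obs = Observation(tag=1, score=2, subject="T",
                  description="Full team briefing before the list began")
```

A record like this makes the later aggregation in Fig. 3 straightforward: observations can be grouped by tag, by subject, or by the sign of the score.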

Sub-group 2018/2019 — new app

A total of six final-year medical students worked with the new app using iPads in a range of clinical areas (again theatres, clinics and wards) in the same hospital. Ahead of the final SSM, two students used and tested the app in December 2018 and, working individually, made 28 observations in 2 half-days. The remaining four final-year medical students were trained to use the new app in June 2019. They each spent 3 half-days and made a total of 72 recordings. Together, these findings totalled 100 observations, of which 68 were negative and 32 were positive. The majority again related to human influences. The app shows these outcomes in a variety of different ways (Fig. 3A–C).

Fig. 3
figure 3

A Data from iPad Observations in 2019. B 2019 Observations — observed practitioner. C Chart showing how to also present the 2019 data using all tag headings

Evaluation of stakeholder perspectives

Stakeholder perceptions were gained from five final-year medical student ‘observers’ and 11 clinical staff members ‘observed’. These were doctors of various grades, scrub nurses, advanced nurse practitioners and nurse ward managers. The data are presented as themes and extracts (Table 7).

Table 7 Qualitative excerpts

The value of the observation method for learning was confirmed by both the students who were observing practice and the observed practitioners. Front-line practitioners perceived the value of the recordings to enhance individual and team learning in clinical practice. For the clinical teams, the work was perceived as a supplement to existing data, such as safer surgery theatre checklists and clinical audits, because it could record a wider range of habitual practices and take account of environmental factors. It was felt that the observation process captured both good and poor practice and that this was helpful for teams in implementing appropriate improvements. The observed practitioners referred to ‘a climate of negativity’ around patient safety and praised the data for allowing the recording of positive clinical practice to provide both balance and an accurate representation of everyday practice. This was something they felt was lost with other recording practices which focused solely on poor practice. Shared learning across clinical areas was discussed as advantageous, particularly the ability to learn from areas showing excellence (Table 7).

Senior medical students perceived this as a good method for student learning on patient safety, as it forced them to see the totality of practice. As observers making the recordings, they recognised that this process helped them to reflect on how to take on an active role within a clinical team. Several students who had qualified by the time they were interviewed described how their observations had fed into their plans for improving their practice.

Acceptability and impact of the observation process were discussed by front-line staff and students. The majority of practitioners were happy to be observed and confirmed that it was acceptable to be observed. Students felt equally comfortable observing all grades of staff. There were some concerns, mainly from non-medical practitioners, who spoke about feeling additional pressure and being suspicious of being watched (Table 7). At the start of the observations, practitioners displayed a kind of Hawthorne effect, with the presence of observers affecting the practice of those observed. They seemed to follow protocols more closely and displayed exemplary behaviour, for example, taking more time than usual to introduce themselves and team members to patients. After a little while, all staff, being busy, quickly returned to their usual practice, unaware of being observed. The medical students sensed the tensions and described adopting a friendly persona to gain practitioners’ confidence and assert their position as helpful observers. Some students eased tensions by reminding observed clinical staff that they were looking not only at what individuals did but also at the systems and the environment in which they were working.

Observed staff commented on the way the students introduced the work prior to commencing their observations. Some staff had not been informed that the observations were a patient safety exercise, or had received only limited information, and therefore felt threatened. In these situations, students had to re-explain the project’s purpose and the anonymity of the process to ease concerns. Some students defused tensions by offering informal feedback afterwards with everyone on the observed team. This, too, appeared to allay concerns. The acceptability of the observations greatly increased with clarity about the reasons behind the process.

The anonymity of observations was a strong theme. All staff were aware that observations were anonymised and valued this, but some were not convinced this was the way forward. In these instances, practitioners wanted the observer to draw attention to malpractice, either by overstepping the line of ‘observer’ by intervening in real time in the situation, or by having permission to report the action(s) after the event. Such an approach would, of course, mean that observations had the potential to have negative consequences for individual staff members, rather than being used to identify higher-level trends across departments. It was also suggested that observations featuring good practice might support doctors’ training portfolios by providing specific objective examples of their work.

The use of the observation data post-collection revealed that staff wished to receive information from the observations as soon after data collection as possible, due to shift patterns and the rotation of staff, and for immediate learning. Those requesting personal feedback on their individual performance also asked for this immediately after the observation session. Daily briefings conducted by teams were signalled as a place for rapid feedback to be shared; staff felt that any required changes were then more likely to be implemented. This was compared to the delayed trust information distribution in the form of emails and bulletins. Monthly team meetings were mentioned as a means of reinforcing information given during daily briefings, as well as the appropriate environment for reflecting on data trends over time.

The mechanism for recording observations revealed a strong preference for the use of the electronic recording device. Students and staff who had experienced both paper and app recordings commented on their preferences. Visually, being seen with a clipboard was described as off-putting. In contrast, observers and observed staff overwhelmingly favoured the electronic device, as these were now familiar to patients and clinicians within clinical environments and, thus, both acceptable and inconspicuous.

Discussion

Since 2002, expert observers on normal flights have collected data about flight crew behaviour, as threats or errors, through an approach known as line operations safety audit (LOSA). The collective findings are fed back into practice and continue to support improvements [35, 36]. Today, these audits are conducted within a strict no-jeopardy context: in other words, flight crews are not held accountable for their actions or errors that are observed [43]. The observers know the procedures and checks thoroughly. During flights, the observer records how flight crews manage these errors and specific human behaviours associated with accidents and incidents. LOSA can be used at any time and the deidentified findings result in learning and efforts to improve performance. In seeking to offer medical students deeper insights into safe practice, we set out to design observational learning, adapting the aviation LOSA methodology for students to experience the complexity of being a member of a healthcare team in an acute hospital.

Our students were initially tasked with looking for poor practice but told us they were also drawn to identify good practice. The final product, a recording app entitled Healthcare Team Observations for Patient Safety (HTOPS), has the potential to pick up ‘light noise’, what is actually happening in real time, using an anonymous feedback system to record poor and excellent practice. The app presents a novel, non-threatening mechanism to identify low-grade risks to patient safety, while providing active learning on patient safety for medical students. We have evidence that students left their special study module with a richer and deeper appreciation of everyday human foibles and weaknesses, and of the possibilities for excellence, as soon-to-be members of care teams.

Teaching tomorrow’s practitioners about safe practice, despite helpful directives [25, 26], remains daunting. There are many social and psychological theories of human behaviour [44,45,46,47,48], there are challenges in engaging whole cohorts with quality improvement techniques [49], and huge amounts of time and resources can be spent on poorly constructed simulations [50]. Within the undergraduate curriculum, we are experimenting with how to distil complex concepts and help inexperienced undergraduates appreciate the expansive body of knowledge on patient safety. Many have observed that students and junior staff can become the eyes and ears of an organisation, but as yet this has not been harnessed or considered as a teaching method, one that enables students to become partners in propelling and contributing to optimal practice.

Our first challenge was to apply aviation observation methods to student learning within the complex systems of healthcare delivery. We sought to avoid designing yet another self-reported measurement tool, as there is a strong acknowledgement that healthcare staff feel overburdened with form-filling tools and yet require a voice [51, 52]. Using participatory action research, supported by a steering group with wide representation, ensured that our cyclical data were debated and discussed so that we moved from aviation thinking to healthcare thinking iteratively. Student willingness to learn more about patient safety through the medical school special study route yielded active participants, while the local acute hospital was more than willing to engage as a partner in the project. The direction of travel was aided by an IT technologist seeking a simple yet workable solution, which required several paper prototypes before being translated into an app. The final product was found to be quick and easy to use, and students could rapidly enter and code both positive and negative observed patient safety behaviours.

Acceptability of student observations within clinical areas, despite team consent to be part of the research, proved challenging. Some clinical practitioners welcomed being observed, while others were sceptical. Expecting qualifying students to explain their presence to unprepared seniors raised some concerns, although the students appeared confident in showing and sharing the potential of their observations. Anonymity was paramount to acceptance because it avoided a culture of blaming others, though some interview participants noted that this could limit the usefulness of feedback in making changes. Overcoming cultural challenges in healthcare and moving from the status quo remains a concern for patient safety leaders and will apply here, as ‘trust’ in this process remains paramount. There was a desire for instant feedback, because several leading clinical nurses and doctors perceived the tool as a vehicle to help them advance good practice, not only because it picked up ‘light noise’ but also because it could highlight positive actions.

The final thematic analysis resonates with others who have tried to map the breadth and depth of contributory factors for patient safety [4]. At this stage, we do not claim the framework is complete, and the iterative nature of this development allows for ongoing refinement. An essential responsibility for the usefulness of the intervention, however, lies with the observer, who can write down and clarify what they observe, offering not just factors but real-time stories that reveal more about the context. The next steps require further studies to (i) confirm the app is complete and workable, (ii) develop a training and learning event for all observers, (iii) explore its potential integration into everyday practice in clinical areas, (iv) affirm whether it changes practice and whether areas where possible poor practice is repeatedly identified do reflect and act on this feedback, and (v) confirm whether practitioners trust and are willing to invest in this method. It is possible to expand the observer role to include qualifying nurses and allied health practitioners, and qualified staff. The advantage here is that becoming an observer as a student appears to heighten the desire to be a good practitioner. Being an observer once qualified, therefore, could offer more senior practitioners a chance to learn about and deepen their understanding of the theory of human behaviour in groups and of human fallibility. The concern is that deploying qualified staff as observers may evoke more suspicion amongst those being observed, and that the method would lose some of its naivety through normalisation. On the other hand, this might be better for speciality-specific safe practice.

This early pilot study has limitations. The small set of students was a self-selected group of final-year students who were keen to know more about patient safety. It is difficult to distinguish between their desire to increase their knowledge and an actual commitment to observe seniors in real-time clinical situations. Despite this, three different groups of students all engaged with the project and produced comparable results. Much depended upon the thematic analysis of what students observed as they went on to the wards, and the final app will require further work and testing to reach a more complete set of possible activities to be scored. Interviews with students and front-line staff were often compromised by participant availability within busy student and clinical working schedules.

Conclusion

Patient safety remains a crucial challenge for modern healthcare delivery. Retrospective analysis of what goes wrong is now being aligned with real-time considerations for adaptive practice that ensures forward thinking [16]. HTOPS offers a testable approach for learning. The process requires further study but this pilot data offers possible solutions not just for highlighting light noise but for deeper learning about safety because the process has the potential to link theory to practice.

Availability of data and materials

The data and materials supporting this study are in the main contained within the tables in this document. The tables summarise the medical students’ paper reports, which are stored at the University of Leicester in a locked cupboard and are available for reference on request. The final set of data is held in the app and is available on request; a summary is shared in this paper.

References

  1. World Health Organisation. Making healthcare Safer. Geneva: WHO Patient Safety and Risk Management. https://apps.who.int/iris/bitstream/handle/10665/255507/WHO-HIS-SDS-2017.11-eng.pdf;jsessionid=610C63F83825A4DD6E61AF14D9032079?sequence=1. Accessed 17 July.

  2. Russ A, Fairbanks RJ, Karsh B, Militello LG, Saleem JJ, Wears RL. The science of human factors: separating fact from fiction. BMJ Qual Saf. 2013;22:802–8. https://doi.org/10.1136/bmjqs-2012-001450.

  3. Waterson P, Catchpole W. Human factors in healthcare: welcome progress, but still scratching the surface. BMJ Qual Saf. 2016;25:480–4. https://doi.org/10.1136/bmjqs-2015-005074.

  4. Lawton R, Taylor N, Clay-Williams R, Braithwaite J. Positive deviance: a different approach to achieving patient safety. BMJ Qual Saf. 2014;23:880–3. https://doi.org/10.1136/bmjqs-2014-003115.

  5. James JT. A new, evidence-based estimate of patient harms associated with hospital care. J Patient Saf. 2013;9(3):122–8. https://doi.org/10.1097/PTS.0b013e3182948a69.

  6. Pronovost PJ, Cleeman JI, Wright D, Srinivasan A. Fifteen years after To Err is Human: a success story to learn from. BMJ Qual Saf. 2016;25(6):396–9. https://doi.org/10.1136/bmjqs-2015-004720.

  7. Macrae C. The problem with incident reporting. BMJ Qual Saf. 2016;25:71–5.

  8. Health Quality Ontario. Patient Safety Learning Systems: a systematic review and qualitative synthesis. Ont Health Technol Assess Ser. 2017;17(3):1–23.

  9. National Health Service Improvement. NRLS national patient safety incident reports: commentary. London: NHS Improvement; 2018. [Accessed 14th June 2021]. https://www.england.nhs.uk/patient-safety/national-patient-safety-incident-reports/

  10. Gandhi TK, Kaplan GS, Leape L, Berwick DM, Edgman-Levitan S, Edmondson A, et al. Transforming concepts in patient safety: a progress report. BMJ Qual Saf. 2018;27:1019–26. https://doi.org/10.1136/bmjqs-2017-007756.

  11. Mukamel DB, Haeder SF, Weimer DL. Top-down and bottom-up approaches to health care quality: the impacts of regulation and report cards. Annu Rev Public Health. 2014;35:477–97. https://doi.org/10.1146/annurev-publhealth-082313-115826.

  12. Martin GP, McKee L, Dixon-Woods M. Beyond Metrics? Utilizing ‘soft intelligence’ for healthcare quality and safety. Soc Sci Med. 2015;142:19–26. https://doi.org/10.1016/j.socscimed.2015.07.027.

  13. Dixon-Woods M, Baker R, Charles K, et al. Culture and behaviour in the English National Health Service: overview of lessons from a large multimethod study. BMJ Qual Saf. 2014;23:106–15. https://doi.org/10.1136/bmjqs-2013-002471.

  14. Collins SA, Couture B, DeBord SA, Gershanik E, Lilley E, Chang F, et al. Mixed-methods evaluation of real-time safety reporting by hospitalised patients and their care partners: the MySafeCare application. J Patient Saf. 2020;16(2):e75–81.

  15. Armstrong N, Brewster L, Tarrant C, Dixon R, Willars J, Power M, et al. Taking the heat or taking the temperature? A qualitative study of a large-scale exercise in seeking to measure for improvement, not blame. Soc Sci Med. 2018;198:157–64. https://doi.org/10.1016/j.socscimed.2017.12.033.

  16. Hollnagel E. Safety-I and Safety-II. Farnham: Ashgate; 2014.

  17. Woodward S. Moving toward a Safety II approach. J Patient Saf Risk Manag. 2019;24(3):96–9.

  18. Mid Staffordshire NHS Foundation Trust Public Inquiry. Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry. Vol. 1. London: The Stationery Office; 2013. Accessed 14th June 2021: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/279124/0947.pdf

  19. Keogh B. Review into the quality of care and treatment provided by 14 hospital trusts in England: overview report. London: NHS; 2013. [Accessed 14th June 2021] https://www.nhs.uk/nhsengland/bruce-keogh-review/documents/outcomes/keogh-review-final-report.pdf

  20. Ramanuj PP, Ryland H, Mitchell EW, Parvizi N, Chinthapalli K. In the spotlight: healthcare inspections as an opportunity for trainee clinicians to be the leaders of today. BMJ Qual Saf. 2014;23(8):706–8. https://doi.org/10.1136/bmjqs-2013-002534.

  21. Ladden MD, Bednash G, Stevens DP, Moore GP. Educating interprofessional learners for quality, safety and systems improvement. J Interprof Care. 2006;20(5):497–505. https://doi.org/10.1080/13561820600935543.

  22. Health Education England. Improving safety through education and training. Report by The commission in education and training for patient safety. 2016 [Accessed 14th June 2021]. https://www.hee.nhs.uk/sites/default/files/documents/Improving%20safety%20through%20education%20and%20training.pdf.

  23. Patient Safety Learning. A patient safety future. A patient safety learning green paper. London: Patient Safety Learning; 2018. http://www.patientsafetylearning.org/

  24. General Medical Council. Outcomes for graduates. London: GMC; 2018. [Accessed 14th June 2021] https://www.gmc-uk.org/-/media/documents/dc11326-outcomes-for-graduates-2018_pdf-75040796.pdf

  25. World Health Organisation. Patient Safety Curriculum guide. Multi-professional edition. Geneva: WHO; 2011. [Accessed 14th June 2021]. https://apps.who.int/iris/bitstream/handle/10665/44641/9789241501958_eng.pdf;jsessionid=9264ED2EB89467DB227F2A7611D2C668?sequence=1

  26. World Health Organisation. Patient Safety Curriculum guide for medical students. Geneva: WHO; 2009. [Accessed 14th June 2021] https://www.who.int/patientsafety/education/curriculum/who_mc_foreword-contents.pdf

  27. General Medical Council. First, do no harm. Enhancing patient safety teaching in undergraduate medical education. London: GMC and Medical Schools Council; 2015. [Accessed 14th June 2021]. https://www.gmc-uk.org/-/media/documents/First_do_no_harm_patient_safety_in_undergrad_education_FINAL.pdf_62483215.pdf

  28. Batchelor A, Anderson E. Defining patient safety: a student perspective. Med Sci Educ. 2019;29:399–408. https://doi.org/10.1007/s40670-018-00690-1.

  29. Goldie J, Dowie A, Goldie A, Cotton P, Morrison J. What makes a good clinical student and teacher? An exploratory study. BMC Med Educ. 2015;15(40):2–8. https://doi.org/10.1186/s12909-015-0314-5.

  30. Killam, et al. Unsafe clinical practices as perceived by final year baccalaureate nursing students: Q methodology. BMC Nurs. 2012;11(26):2–13. https://doi.org/10.1186/1472-6955-11-26.

  31. Heinrichs WL, Le Roy EB, Dev P. SBAR ‘flattens the hierarchy’ among caregivers. Stud Health Technol Inform. 2012;173:175–82. https://doi.org/10.3233/978-1-61499-022-2-175.

  32. Dekker SWA. The field guide to understanding ‘human error’. Farnham, Surrey: Ashgate; 2014.

  33. Klinect JR, Murray P, Merritt A, Helmreich R. Line operations safety audit (LOSA): definition and operating characteristics. In: Proceedings of the 12th International Symposium on Aviation Psychology. Dayton, OH: The Ohio State University; 2003. p. 663–8. [Accessed 14th June 2021]. https://www.faa.gov/about/initiatives/maintenance_hf/losa/publications/media/klinect_operatingcharacteristics2003.pdf

  34. Federal Aviation Administration. Advisory circular 120-90. Line Operations Safety Audits. Washington DC: Federal Aviation Administration; 2006. [Accessed 14th June 2021]. https://www.faa.gov/regulations_policies/advisory_circulars/index.cfm/go/document.information/documentID/22478

  35. International Civil Aviation Organisation. Manual of evidence-based training. Montreal: ICAO; 2013. [Accessed 14th June 2021]. https://skybrary.aero/bookshelf/books/3177.pdf

  36. Tesmer B. LOSA Programme stimulates change that improves safety in line operations. Intern Civil Aviation Org J. 2002;57(4):13.

  37. Dekker SWA, Hugh TB. A just culture after Mid Staffordshire. BMJ Qual Saf. 2014;23:356–8. https://qualitysafety.bmj.com/content/23/5/356.

  38. Neuhaus C, Hofer S, Hofmann G, Wächter C, Weigand MA, Lichtenstern C. Perioperative safety: learning, not taking, from aviation. Anesth Analg. 2016;122(6):2059–63. https://doi.org/10.1213/ANE.0000000000001315.

  39. Kuper A, Reeves S, Levinson W. An introduction to reading and appraising qualitative research. BMJ. 2008;337:a288. https://doi.org/10.1136/bmj.a288.

  40. Meyer J. Action research. In: Pope C, Mays N, editors. Qualitative research in health care. 3rd ed. Malden: Blackwell Publishing; 2006. p. 121–42.

  41. McTaggart R. Reflection on the purposes of research, action, and scholarship: a case of cross-cultural participatory action research. Syst Pract Action Res. 1999;12:493–511. https://doi.org/10.1023/A:1022417623393.

  42. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. https://doi.org/10.1191/1478088706QP063OA.

  43. Kanki BG, Anca J, Chidester TR. Crew resource management. London: Academic Press Elsevier. p. 354. https://books.google.co.uk/books?id=Xg2GDwAAQBAJ&pg=PA354&lpg=PA354#v=onepage&q&f=false.

  44. Hall P. Interprofessional teamwork: professional cultures as barriers. J Interprof Care. 2005;19(Suppl 1):188–96. https://doi.org/10.1080/13561820500081745.

  45. Reeves S, Suter E, Goldman J, Martimianakis T, Chatalalsingh C, Dematteo D. A scoping review to identify organizational and education theories relevant for interprofessional practice and education. Calgary: Calgary Health Region; 2007.

  46. Holden RJ. People or systems? To blame is human. The fix is to engineer. Prof Saf. 2009;54(12):34–41 https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3115647/.

  47. Cordingley L, Peters S, Hart J, Rock J, Hodges L, McKendree J, et al. What psychology do medical students need to know? An evidence based approach to curriculum development. Health Soc Care Educ. 2013;2(2):38–47. https://doi.org/10.11120/hsce.2013.00029.

  48. Thomson K, Outram S, Gilligan C, Levett-Jones T. Interprofessional experiences of recent healthcare graduates: a social psychology perspective on the barriers to effective communication, teamwork and patient-centred care. J Interprof Care. 2015;29:624–40. https://doi.org/10.3109/13561820.2015.1040873.

  49. Radenkovic D, Mackenzie R, Bracke S, Mundy A, Craig D, Gill D, et al. Involving medical students in service improvement: evaluation of a student-led, extracurricular, multidisciplinary quality improvement initiative. Adv Med Educ Pract. 2019;10:781–93. https://doi.org/10.2147/AMEP.S210311.

  50. Anderson ES, Bennett S. Taking a closer look at undergraduate acute care interprofessional simulations: lessons learnt. J Interprof Care. 2020;34(6):772–83. https://doi.org/10.1080/13561820.2019.1676705.

  51. Hempel S, O’Hanlon C, Lim YW, Danz M, Larkin J, Rubenstein L. Spread tools: a systematic review of components, uptake, and effectiveness of quality improvement toolkits. Implement Sci. 2019. https://link.springer.com/article/10.1186/s13012-019-0929-8.

  52. Morrison EW. Employee voice behavior: integration and directions for future research. Acad Manag Ann. 2011;5:373–412.

Acknowledgements

We wish to thank University Hospitals of Leicester NHS Trust for their support in this project, and particularly Moira Durbridge, Director of Quality Transformation and Efficiency Improvement; the Medical Director, Mr Andrew Furlong; and Claire Rudkin, Head of Patient Safety. We also wish to thank pilot Captain Colin Adair, an aviation human factors trainer.

Funding

This work was funded by the Wellcome Trust via the University translational funds. Graham Martin’s contribution was supported by The Healthcare Improvement Studies Institute (THIS Institute), University of Cambridge. THIS Institute is supported by the Health Foundation, an independent charity committed to bringing about better health and healthcare for people in the UK.

Author information

Authors and Affiliations

Authors

Contributions

ESA conceptualised the project and read and analysed all the student observations, and TRLG read and analysed all the student paper observations. TF designed the app and helped ESA and TRLG import all the data into the app in the final cycle of testing. FW conducted the interviews with staff and students and, with ESA, analysed the qualitative data. RIN and GM were steering group members and analysed the project development throughout. All authors contributed to the writing and read and approved the final manuscript.

Corresponding author

Correspondence to E. S. Anderson.

Ethics declarations

Ethics approval and consent to participate

Ethical permission for this study was granted by the University of Leicester and is reported in the paper (7741-esa1-medicaleducation). This paper does not include any animal or human data or tissues (‘Not applicable’).

Consent for publication

Not applicable. We have no concerns for personal data in this paper.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Anderson, E.S., Griffiths, T.R.L., Forey, T. et al. Developing Healthcare Team Observations for Patient Safety (HTOPS): senior medical students capture everyday clinical moments. Pilot Feasibility Stud 7, 164 (2021). https://doi.org/10.1186/s40814-021-00891-3

Keywords