How do we improve patient safety? A look at the issues and an interview with Dr. Britt

In his welcoming remarks at the 40th annual meeting of the Association of Veterans Administration Surgeons (AVAS) in April 2016, L. D. Britt, MD, MPH, DSc(Hon), FACS, FCCM, FRCSEng(Hon), FRCSEd(Hon), FWACS(Hon), FRCSI(Hon), FCS(SA)(Hon), FRCSGlasg(Hon), Past-President of the American College of Surgeons (ACS), expressed concern that surgeons have made little progress in the prevention of adverse events in patient care over the last decade. “There still remain documented challenges with respect to patient safety and adverse events, including ‘never events.’ For example, a foreign object is left inside a patient 39 times per week, and wrong procedure/wrong site surgeries occur more than 40 times per week,” said Dr. Britt, Henry Ford Professor and Edward J. Brickhouse Chair, department of surgery, Eastern Virginia Medical School, Norfolk, and a member of The Joint Commission. With that assertion, Dr. Britt issued a challenge to the AVAS to address the prevention of adverse events, adding that the U.S. Department of Veterans Affairs (VA) is an ideal environment to test patient safety models that can then be applied in the private sector.

This article examines the issue of patient safety and specifically focuses on the following:

  • Persistence of patient safety issues in the private sector
  • Barriers to implementing patient safety initiatives
  • Efforts to apply high-reliability organization (HRO) and crew resource management (CRM) models in health care
  • The pitfalls and benefits of checklists

This article concludes with an interview that Dr. Lipshy conducted with Dr. Britt to receive further insights from a noted patient advocate and surgical leader.

Persistence of patient safety issues

Since the publication of the Institute of Medicine (now the National Academy of Medicine) report, To Err Is Human: Building a Safer Health System, in 2000, many health policy experts have continued to point to problems in patient safety and the slow progress that has been made toward their reduction or elimination.1 As recently as May 2016, the British Medical Journal (BMJ) published an article claiming that medical error is the third-leading cause of death in the U.S.2

While many private sector hospitals have encountered difficulty in improving patient safety, the VA has had some success in this area. One of the first VA studies to assess adverse events in patient care was published in 2001. This report stated that “almost a quarter (22.7 percent) of active care patient deaths were rated as at least possibly preventable by optimal care,” with 6.0 percent rated as “probably or definitely preventable.”3 Both estimates are notably lower than the 53 percent of adverse events identified as preventable in To Err Is Human.1 To further clarify the implications of their research, the authors conceded that “after considering three-month prognosis and adjusting for the variability and skewness of reviewers’ ratings, clinicians estimated that only 0.5 percent of patients who died would have lived three months or more in good cognitive health if care had been optimal, representing roughly 1 patient per 10,000 admissions to the study hospitals.”3,4

Subsequently, the VA National Center for Patient Safety (NCPS) produced reports demonstrating progress and providing insight into preventable adverse events. Neily’s 2009 article assessed the effect of the 2003 VA Ensuring Correct Surgery and Invasive Procedures (ECSIP) directive on incorrect surgical procedures both in and out of the operating room (OR) from 2001 to 2006.4 Paradoxically, after implementation of the protocols noted in the VA ECSIP guidance, the number of reported events increased. Individuals familiar with VA ECSIP indicated that an increase in reporting rather than a rise in incidents likely occurred (in-person communication at ACS Clinical Congress, October 2010). Neily noted that communication problems were the cause of approximately one-quarter of the adverse events. Human factors, time-out issues, training issues, and non-standardized protocols led to most of the other incidents. Another important observation was that “events occurred about equally in and out of the OR.” The team concluded that “time-out procedures alone have not been enough to prevent incorrect surgical and invasive procedures. We advocate earlier improved communication based on crew resource management (CRM) principles.”4

A 2011 follow-up article published in JAMA Surgery showed a decrease in reported adverse events from 2001 to 2009. The group proposed that implementation of and training on the ECSIP directive, as well as clinical team training, likely contributed to this change.5

Although the BMJ article created considerable controversy—mostly because of questions regarding the validity of the data—the overall message of the article resonated: physicians need to take charge of ensuring patient safety. In the article, the authors proposed several strategies that health care professionals can implement to reduce preventable deaths in health care:2

  • Improve transparency, so that when errors occur, their effects can be intercepted. Achieving this goal requires ending the culture of “shame and blame,” which pervades many health care institutions and often inhibits staff from reporting problems.
  • Identify strategies to prevent vulnerabilities in the system from placing patients at risk of complications. This part of the process would require upfront discussion of potential issues that could lead to mistakes, the consequences thereof, and approaches to managing errors when they occur.
  • Follow principles that take human limitations into account. Make it easier for people to provide safe care by identifying potential failure points—situations where the safety system fails and errors occur—and help them to compensate for their shortcomings.

Dr. Britt’s comments at the AVAS meeting and the BMJ article provided an excellent backdrop to the August 2016 National Surgical Patient Safety Summit (NSPSS), in Rosemont, IL, which was sponsored by the ACS and the American Academy of Orthopaedic Surgeons (AAOS). NSPSS participants expressed concerns about the effectiveness of patient safety strategies and discussed the following potential reasons why patient safety initiatives have fallen short of expectations:

  • Minimal support from leadership due to lack of staff buy-in, with physicians typically remaining the key barrier to implementation
  • Individual misconduct and noncompliance
  • Failure to understand that individual failures are as important as system failures
  • Communication failures
  • Ongoing attempts to apply quality improvement models, which may be unsuitable in the complex health care environment
  • Physician concerns about loss of autonomy and being required to adhere to pre-established standards
  • Potential distractions created through the use of information technology and the electronic health record (EHR)
  • Questions about the concept of “zero harm”
  • An overemphasis on data and dogma without due consideration for context of events

Barriers to implementing patient safety initiatives

The list of potential reasons why patient safety mechanisms fail warrants further examination. A description of each barrier is as follows.

Failure to attain staff buy-in

One major hurdle to implementing patient safety initiatives has been a lack of staff physician buy-in. Some health care professionals question the value of the checklists that are commonly used to ensure that the OR team takes standard precautions during surgical procedures. Other staff suffer from “checkbox fatigue” and may simply go through the motions without really thinking about the effect of their actions on the individual patient.6 In a review of the implementation of the World Health Organization (WHO) surgical safety checklist in the U.K., Russ and colleagues concluded that successful implementation requires the following steps:7

  • Modification to suit the local context
  • Education tailored to the needs of each stakeholder
  • Identification of local champions to promote safety in clinical work areas
  • Executive leadership support
  • A system of accountability for “improper” behavior or noncompliance
  • Careful auditing

Individual misconduct and noncompliance

Although the most critical adverse events occur in the OR, surgeons and anesthesia professionals frequently prove to be the most averse to patient safety initiatives. In 2015, for example, Russ and colleagues noted that “the most common barrier to checklist implementation, reported by 51 percent (61 of 119) of the sample, was active resistance or passive noncompliance from individuals on the OR team, most frequently (84 percent) from senior surgeons and/or anesthesiologists. This often made it challenging for the person leading the checks (often a nurse) to complete them in the intended manner, or without feeling personally attacked.” Often only after something untoward occurs do surgeons take an interest.7

This opposition perhaps should be expected, given that safety measures frequently are developed without input from the surgical teams expected to use them. Surgeons question the need for checklists in particular, claiming they can disrupt the flow of care, reduce efficiency, and have a minimal impact on improvements in care. Similarly, pilots were initially distrustful of cockpit management training initiatives, but now see the benefits of safety checklists and use them as standard operating procedure (personal communication via telephone, Alan Diehl, PhD, author of Air Safety Investigators: Using Science to Save Lives—One Crash at a Time, August 23, 2013). So the question becomes, how do we convince surgeons and other health care professionals that compliance with patient safety initiatives has a meaningful impact on patient outcomes?

Hospitals and patient safety advocates point to studies outside the U.S. and within the VA that show reductions in mortality following implementation of surgical safety programs.5,8 Sara J. Singer, professor of health care management and policy, department of health policy and management, Harvard T. H. Chan School of Public Health, Boston, MA, noted that when the Safe Surgery 2015: South Carolina (SSSC) initiative was launched, 38 of 67 South Carolina hospitals completed the pre-implementation survey. Of those 38 hospitals, 13 indicated that they had fully completed the SSSC program. The study concluded that hospitals that successfully participated in the South Carolina Surgical Safety Checklist project experienced an improvement in quality of care.9

Understanding individual failure is as important as overcoming systemic failure

Atul Gawande, MD, MPH, FACS, professor of surgery, Harvard Medical School; professor, Harvard School of Public Health; general surgeon, Brigham and Women’s Hospital; and executive director, Ariadne Labs, Boston, noted that “fallibility comes from both failure due to ignorance and failure due to ‘ineptitude,’ the latter being failure to deliver on existing knowledge. Both can occur in surgery, but failure to deliver plays a substantial part” (personal communication at meeting, December 31, 2012).10,11

A study led by Peter J. Fabri, MD, PhD, FACS, surgeon and professor, industrial engineering, University of South Florida, Tampa, in 2008 reviewed 9,830 patient procedures and reported that 78.3 percent of surgical complications were related to medical error. In 75 percent of those cases, the error contributed to more than 50 percent of the outcome. The patient either died or suffered permanent injury in 25 percent of the cases. In contrast to many other publications on this topic, system and communication issues combined contributed to only 4 percent of the complications—the rest were due to human error. In cases with human error, 63.5 percent were due to an “error of technique,” 20 percent to a “mistake” (doing the wrong thing), and 58 percent to a “slip” (doing the right thing incorrectly). Errors in judgment were reported in 29.6 percent, inattention to detail in 29.3 percent, and incomplete understanding of the problem in 22.7 percent of the cases studied (personal communication via telephone, June 28, 2016).12

Communication failures

Studies indicate that a lack of communication and teamwork causes most preventable sentinel events. Pronovost and several other investigators have shown that health care professionals who work in high-intensity environments, such as the OR and the emergency department, have negative perceptions of physician communication and teamwork skills. Physicians tend to overrate their abilities in these areas. Pronovost noted this discrepancy when reviewing data on intensive care unit (ICU) nurses’ and physicians’ impressions of teamwork, finding that physicians rated the nurses’ collaboration skills at 90 percent, whereas nurses rated the physicians at only 54 percent.13 Until physicians accept that poor teamwork and communication contribute to adverse events, further improvements cannot occur.

Over-reliance on CRM and other HRO models may hinder progress

In 1992, Dr. Diehl reported a 36 percent to 81 percent decline in aviation accidents as a result of CRM training.14 In 2000, Helmreich and others, seeking a safety improvement model to reshape medicine, chose CRM because of the similarities between pilots and physicians.15-17 A 2016 meta-analysis of team training in health care showed that team training reduces medical errors by only 19 percent and improves clinical performance by only 34 percent.18 If aviation and medicine are so alike, why has medicine failed to achieve the results seen in aviation when the same techniques are implemented?

Richard Karl, MD, FACS, pilot for JetSuite Irvine, CA; owner, Safety Institute; and chairman emeritus of the department of surgery, University of South Florida, stated that although medicine is far more complicated than aviation, many tools used to improve airline safety could be used to reduce error and improve outcomes in health care if implemented properly. However, “You can’t just slide a checklist under the operating room door and expect it to work,” he added (personal communication, July 2016).

Douglas E. Paull, MD, MS, Director, Patient Safety Curriculum and Medical Simulation, National Center for Patient Safety, suggested that, considering the dynamic state in which medical teams function, health care institutions should consider introducing the concept of “teaming.”19 Amy C. Edmondson, PhD, professor of leadership and management, Harvard Business School, noted that health care teams are not static. Teaming accounts for the dynamic interactions between units or special teams. Teaming reflects the activity of working together as an active process. It represents a fluid network of interconnected individuals working in temporary teams to solve problems (personal communication via telephone, July 25, 2016). The members of these groups have different levels of training and competence, with various members performing different responsibilities in the course of care or a procedure. Furthermore, several leadership units are often at play in an OR, including nursing, anesthesia, and surgery, making teaming an even more relevant model for the OR.19

One final mechanism that may lead to improved patient outcomes is the use of James Reason’s “Swiss cheese model,” or layered security approach, to detect errors in health care and other high-reliability organizations (HROs).20 Like the holes in Swiss cheese, this model suggests that in a complex system, an error may occur at one point, but the various other layers of the process provide opportunities for correction. This model, however, may be too simplistic for use in health care, where multiple cogs move simultaneously. In other words, the problem that we believe we have put into check may change the second we move on to the next task.21
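Reason’s layered-defense idea can be expressed as a toy probability model: if each defense layer (a “slice” of cheese) misses an error with some probability, an error reaches the patient only when it slips through every layer, so the overall risk is the product of the individual miss probabilities. The sketch below is purely illustrative; the layer names and probabilities are hypothetical assumptions, not measured values.

```python
# Illustrative sketch of the "Swiss cheese" layered-defense model:
# an error reaches the patient only if it slips through every layer.
# Layer names and miss probabilities are hypothetical, not measured data.

from functools import reduce

# Each defense layer with an assumed probability of *missing* an error
layers = {
    "pre-op verification": 0.10,   # misses 10% of errors
    "site marking":        0.15,
    "time-out":            0.20,
    "intra-op imaging":    0.05,
}

def p_error_reaches_patient(miss_probs):
    """Probability an error passes every independent layer:
    the product of the individual miss probabilities."""
    return reduce(lambda acc, p: acc * p, miss_probs, 1.0)

p = p_error_reaches_patient(layers.values())
print(f"P(error slips through all layers) = {p:.5f}")  # 0.10*0.15*0.20*0.05 = 0.00015
```

The product form assumes the layers fail independently and remain fixed, which is exactly the assumption questioned above: in a dynamic OR the “holes” shift and interact, so the true risk can be far higher than this simple multiplication suggests.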

Physician concerns about loss of autonomy

Lack of leadership is a key driver of sentinel and adverse events. According to Dr. Paull, “We ultimately have responsibility for our patient outcomes and experiences. Leaders set the tone for safety culture…leaders by their words and actions develop an environment that rewards people for speaking up with safety concerns.” Traditionally, surgeon leadership has meant serving as the “captain of the ship.” While the captain of the ship model may no longer be applicable in environments that emphasize patient-centered, team-based care, physicians must take the helm in overseeing the patient safety process.

The concept of supervising the surgical patient care process would suggest that surgeons should be invested in the entire patient safety venture. However, surgeons frequently disengage from conversations about nontechnical skills. Dr. Edmondson recently noted that in an earlier study on the development of minimally invasive cardiac surgery programs, success hinged on how the lead surgeon viewed his or her role as a member of the team. In the instances where the lead surgeon viewed himself or herself as part of a partnership aimed at benefiting both the team and the patient, the project succeeded. These surgeons tore down the silos and ensured everyone contributed equally to the success or failure of the project. However, when the lead surgeon used authoritarian (top-down) leadership and simply expected everyone to execute a task as directed, the project failed.21

Potential distractions created by information technology and the EHR

In 2013, Mark Chassin, MD, MPP, MPH, FACP, president and chief executive officer of The Joint Commission, noted that in HROs, information technology is considered the vehicle for achieving “nearly perfect processes.”22 The EHR, with its electronic checklists and documentation pathways, was designed to make health care safer and more efficient. However, keying data into the EHR and marking off documentation in the EHR checklist seem to have taken precedence over actual patient care. Indeed, safety documentation processes have been reported to create distractions and reduce efficiency. Dr. Paull agrees that “checklists—read and verified prior to operations—are not designed to increase a documentation burden. If created by a provider on the front line and used correctly, they are cognitive aids preventing the user from forgetting important steps/information and that should help evoke discussion and engagement among team members. The checklists we promote are simply tools/artifacts that help facilitate teamwork and communication.”

Questions about “zero harm”

In 2013, Dr. Chassin also repeated the call for zero patient harm.22 However, some patient safety experts question whether zero patient harm is a desirable goal. A hallmark of HROs is the commitment to resilience. As noted in Weick and Sutcliffe’s work, HROs operate with the understanding that in spite of all their efforts and safety measures, they can never be completely error-free. Rather, HRO administrators strive to recognize errors early, mitigate the consequences, and prevent the organization from becoming disabled by an adverse event.23

The public and many hospital administrators are focused on zero tolerance of error or adverse events. Too often hospital committees focus on individual untoward events without regard to the context surrounding the error. As a result, they divert their focus from improving patient care processes to explaining the circumstances surrounding a single event. As Dr. Paull said, “Zero harm does not mean zero error! Key to patient safety is preventing errors from reaching patients and causing harm. Never events represent the low threshold for participation in health care. That is why there is mandatory reporting of such events in many states. Close calls and such are what HROs focus upon. No patient actually gets hurt [in these instances] but we do not want anything to happen in future” (personal communication via e-mail, July 31, 2016).

Seeing the concepts in action

Following a 2015 report regarding the persistence of wrong site spine surgery and pain intervention events, a regional spine surgeon and pain intervention specialist workgroup was established to attempt to understand the root cause(s) of this problem. The natural assumption presented to our group was that when wrong level procedures occurred, there must have been willful violation of known precautions. The specialists reiterated that, in their experience, this did not seem to be the case. Their opinion was that, in most cases, the universal protocol was likely followed throughout the procedure and that other factors were at play. The participants recommended that physicians take the lead in education regarding the effects of the following factors on error:24

  • Distraction and fatigue
  • Routineness of procedure, complacency
  • Communication problems, including handoffs
  • Equipment or staff problems during localization
  • Patient characteristics: Body habitus, spinal deformities, vertebral morphological variant
  • Confirmation bias: Accepting inadequate views due to positioning in lieu of alternate/additional imaging or secondary confirmation with an additional expert

Similar to others, the participants in these calls concluded that lengthy policies and checklists that do not pertain to a particular institution or team may create bigger problems rather than solve them. Additionally, they felt that single-episode, one-size-fits-all training is unlikely to have lasting effects, as it may not “touch the heart and soul of the team.” They acknowledged that success requires a multi-tiered approach to overcome this hurdle.5,6,9 They also agreed that institutional leadership must be aware of the upstream and downstream interferences that occur before or after the OR universal protocol time-out checklist verifications.25 Finally, our surgeons expressed concerns that institutional focus on the surgical team involved in an adverse event, while ignoring systemic issues, will ultimately lead to loss of motivation and subsequent burnout.

The health care industry has numerous hurdles to scale to reach the goal of improved patient safety and high reliability, but if we are unified, it is a goal within our grasp.

Disclaimer

The content of this article represents the opinions of the authors and the researchers cited herein and does not represent the opinion of the U.S. government, the U.S. Department of Veterans Affairs, or the Veterans Health Administration.


References

  1. Kohn LT, Corrigan JM, Donaldson MS (eds). To Err Is Human: Building a Safer Health System. Committee on Quality of Health Care in America, Institute of Medicine. Washington, DC: National Academy Press; 2000.
  2. Makary M, Daniel M. Medical error—the third leading cause of death in the US. BMJ. 2016;353:i2139.
  3. Hayward RA, Hofer TP. Estimating hospital deaths due to medical errors: Preventability is in the eye of the reviewer. JAMA. 2001;286(4):415-420.
  4. Neily J, Mills PD, Eldridge N, et al. Incorrect surgical procedures within and outside of the operating room. Arch Surg. 2009;144(11):1028-1034.
  5. Neily J, Mills PD, Eldridge N, et al. Incorrect surgical procedures within and outside of the operating room: A follow up report. JAMA Surg. 2011;146(11):1235-1239.
  6. Bergs J, Lambrechts F, Simons P, et al. Barriers and facilitators related to the implementation of surgical safety checklists: A systematic review of the qualitative evidence. BMJ Qual Saf. 2015;24(12):776-786.
  7. Russ SJ, Sevdalis N, Moorthy K. A qualitative evaluation of the barriers and facilitators toward implementation of the WHO surgical safety checklist across hospitals in England: Lessons from the “Surgical Checklist Implementation Project.” Ann Surg. 2015;261(1):81-91.
  8. Bock M, Doz P, Fanolla A. A comparative effectiveness analysis of the implementation of surgical safety checklists in a tertiary care hospital. JAMA Surg. 2016;151(7):639-646.
  9. Molina G, Jiang W, Edmondson L, et al. Implementation of the surgical safety checklist in South Carolina hospitals is associated with improvement in perceived perioperative safety. J Am Coll Surg. 2016;222(5):725-736.
  10. Gawande A. The Checklist Manifesto: How To Get Things Right. 1st ed. New York, NY: Metropolitan Books; 2010.
  11. Kirby T, Gawande A. Making surgery safer worldwide. Lancet. 2010;376(9746):1045.
  12. Fabri PJ, Zayas-Castro JL. Human error, not communication and systems, underlies surgical complications. Surgery. 2008;144(4):557-565.
  13. Pronovost P. Intensive care unit safety reporting system (ICUSRS). Safety and medicine. PowerPoint presentation. Available at: OCW.jhsph.edu/courses/patientsafety/PDFs/PS_lec4_Pronovost.pdf. Accessed December 9, 2016.
  14. Diehl A. Does cockpit management training reduce aircrew error? Proceedings of the Twenty-Second International Seminar of the International Society of Air Safety Investigators. Canberra, Australia. November 4–7, 1991. ISASI. 1991;24(4):46.
  15. Helmreich RL. On error management. BMJ. 2000;320(7237):781-785.
  16. Helmreich RL, Davies JM. Team performance in the operating room. In: Bogner MS, ed. Human Error in Medicine. Hillsdale, NJ: Erlbaum; 1994:225-216.
  17. Pizzi L, Goldfarb NI, Nash DB. Crew resource management and its applications in medicine. In: Making health care safer: A critical analysis of patient safety practices. Evidence Report/Technology Assessment: Number 43. AHRQ Publication No. 01-E058, July 2001.
  18. Hughes AM, Gregory ME, Joseph DL, et al. Saving lives: A meta-analysis of team training in healthcare. J Appl Psychol. 2016;101(9):1266-1304.
  19. Edmondson AC. Teaming: How Organizations Learn, Innovate, and Compete in the Knowledge Economy. San Francisco, CA: Jossey-Bass; 2012.
  20. Reason J. Managing the Risks of Organizational Accidents. Burlington, VT: Ashgate; 1997.
  21. Cook R, Rasmussen J. “Going solid”: A model of system dynamics and consequences for patient safety. Qual Saf Health Care. 2005;14(2):130-134.
  22. Chassin MR, Loeb JM. High reliability health care: Getting there from here. Milbank Q. 2013;91(3):459-490.
  23. Weick KE, Sutcliffe KM. Managing the Unexpected: Resilient Performance in an Age of Uncertainty. San Francisco, CA: John Wiley; 2007.
  24. Lipshy KA. Human Error: Why Do We Miss What’s Right in Front of Us (What Gorilla?). In: Lipshy KA, ed. Crisis Management Leadership: Training to Survive the Critical Moment, 2nd edition. Hampton, Virginia: 2016: 21-45.
  25. Paull DE, Mazzia LM, Neily J. Errors upstream and downstream to the universal protocol associated with wrong surgery events in the Veterans Health Administration. Am J Surg. 2015;210(1):6-13.


Contact

Bulletin of the American College of Surgeons
633 N. Saint Clair St.
Chicago, IL 60611
