Standardization and Reliability



Case 16-1









INADEQUATE STERILIZATION OF SURGICAL INSTRUMENTS


Midway through the wound debridement, the scrub nurse noted that the sterilization indicators had not changed color: the surgeon was operating with instruments that had not been properly sterilized. The subsequent root cause analysis revealed that the sterile processing technician, at the end of his shift, had forgotten to push the button to start the autoclave. The technician arriving on the next shift assumed the autoclave had completed its cycle and, not noticing that the sterilization indicator on the cart had not changed color, removed the cart of unsterilized trays and placed them on the shelf for use.







Introduction





In 1999, the Institute of Medicine (IOM) report To Err Is Human highlighted two studies from the 1980s suggesting that between 44,000 and 98,000 patients die each year as a result of preventable medical errors.






The subsequent IOM report, Crossing the Quality Chasm, noted, “The current systems cannot do the job. Changing systems of care will.” The report went on to describe the six aims of safety, effectiveness, efficiency, patient-centeredness, timeliness, and equity. With these aims, the IOM defined the ultimate vision for the U.S. health care system.






The limitations of the current health care system were further highlighted by Elizabeth McGlynn’s 2003 study, in which her group found that patients receive only 55% of the care warranted by medical evidence. Furthermore, they found that the likelihood that an individual patient would receive all appropriate care was 2.5%.






Human Factors





The Individual



A major contributor to this performance shortfall is the inherent limitation of human performance. Table 16-1 shows expected human error rates under conditions of no undue time pressure or stress. Note that “under very high stress when dangerous activities are occurring rapidly,” the error rate can be as high as one in four. Therefore, system designs that depend on perfect human performance are destined to fail. Furthermore, systems designed to function under conditions of high stress with frequent dangerous activities carry a higher burden to ensure a favorable outcome.




Table 16-1 Nominal Human Error Rates for Selected Activities 



As defined by the Federal Aviation Administration, “Within the FAA, human factors entail a multidisciplinary effort to generate and compile information about human capabilities and limitations and apply that information to equipment, systems, facilities, procedures, jobs, environments, training, staffing, and personnel management for safe, comfortable, and effective human performance.”



When accounting for human factors it is helpful to consider the human and the system separately. Reliable systems must compensate for the limitations of human performance. In addition, organizational characteristics can negatively or positively impact human performance.




Practice Point





  • Organizational characteristics can negatively or positively impact human performance. When accounting for human factors, it is helpful to consider the human and the system separately.



When redesigning systems to improve performance and reduce adverse events, hospitalists should recognize the factors that may negatively impact human performance so that the design can account for the expected vulnerability.



There are limits to human memory. On average, a person can keep seven, plus or minus two, elements in short-term memory. A frequent system vulnerability is reliance on human memory to retrieve key information at the time it is needed. In reliable systems, key information is made available at those key times rather than relying on human recall.



Rushed people cut corners. Over time, the repeated shortcuts result in a narrowing safety margin. The natural tendency to cut corners, combined with the repeated experience of no negative outcome, reassures individuals that they remain within an appropriate level of safety, or reliability. This is described as “normalization of deviance.”



A glaring example of normalization of deviance was the 1986 space shuttle Challenger explosion 72 seconds after liftoff. The subsequent investigation found that the cause of the explosion was the failure of the O-rings in the solid rocket boosters. On previous launches there had been evidence of damage to the O-rings. The following quote from the “Report of the Presidential Commission on the Space Shuttle Challenger Accident” illustrates how normalization of deviance led to the disaster. NASA and Morton Thiokol accepted escalating risk apparently because they “got away with it last time.” As Commissioner Richard Feynman observed, the decision making was




  • a kind of Russian roulette…. [The Shuttle] flies [with O-ring erosion] and nothing happens. Then it is suggested, therefore, that the risk is no longer so high for the next flights. We can lower our standards a little bit because we got away with it last time… You got away with it, but it shouldn’t be done over and over again like that.



Normalization of deviance occurs because of the natural human tendency to slip into believing that, despite shortcuts, adequate safety or reliability margins remain. In health care, normalization of deviance is often a barrier to implementing quality and safety interventions such as a standardized checklist for central line insertion or an intentional pause, with completion of a checklist, before a procedure to ensure safety.



Stress degrades human performance by causing tunnel vision and filtering, the selective disregard of information believed to be irrelevant. This results in a loss of the pattern recognition that humans use to rapidly discern complex situations. Fatigue negatively affects both short-term and long-term memory; its impact is similar to having a blood alcohol level of 0.1%. Other factors that commonly degrade health care worker performance include multitasking, interruptions, and environmental factors.






The Organization



James Reason described characteristics that affect an organization’s capacity to support or impede an individual’s ability to function reliably. Human error can be addressed from an organizational perspective using either the “person approach” or the “system approach.”



The person approach focuses on the actions of the frontline staff who commit errors. The errors, it is believed, are due to flawed mental processes that can be voluntarily corrected with enough motivation, attention, and vigilance. The institutional response therefore focuses on correcting the variation in human behavior. Frequently, the responses rely on disciplinary measures, threat of litigation, retraining, and naming, blaming, and shaming, so that the individual will focus more intently on the task at hand and not make a similar error. Often, new policies and procedures are written to ensure the correct behavior. In short, the person approach implicitly assumes that bad things happen to bad people.



The fundamental premise of the system approach is the anticipation of human fallibility. Human errors are to be expected and are seen as consequences of inadequate system design. It is believed that most errors lead to undesired outcomes because “upstream” system barriers and defenses have failed.




Practice Point





  • The fundamental premise of the “system approach” is the anticipation of human fallibility. Human errors are to be expected and are seen as consequences of inadequate system design. It is believed that most errors lead to undesired outcomes because “upstream” system barriers and defenses have failed. Reliable systems must compensate for the limitations of human performance.



The person approach is appealing on several levels. It is emotionally satisfying to blame an individual for an adverse event, and divorcing the unsafe act from the organization is clearly in the interests of managers. These benefits, however, come at a great cost: the willingness of staff to report errors. Contrast this with the experience in aviation maintenance, where in 90% of mishaps the worker is found blameless. In order to improve, it is important to perform a detailed analysis of incidents and near misses; without a reporting culture, this information never surfaces (see Chapter 7).



Conversely, the system approach recognizes human fallibility and designs systems that succeed despite human error. In reliable organizations, admission of errors and near misses is reinforced, and leadership realizes that early detection of the latent conditions that promote human error is essential to creating reliable systems.
