Chapter 107

Medical Errors and Patient Safety



The 1999 Institute of Medicine (IOM) report To Err Is Human: Building a Safer Health System estimated that 44,000 to 98,000 patients die annually in the United States as a result of medical errors. The report showed that errors in medicine were poorly understood and infrequently investigated, largely because the culture of hospitals and the medical profession viewed these errors as the product of human failings. This belief led to a person-focused approach to error investigation, in which strategies to reduce medical errors involved censuring staff or re-educating them about proper protocols and procedures. However, it is now known that this approach deprives organizations of the greatest opportunities for meaningful learning and improvement.


Since the IOM report, health care has increasingly adopted a systems-focused approach to medical error. In this view, medical errors are not intrinsically different from errors in other complex, high-risk industries such as aviation or nuclear power generation. Like medicine, these fields rely heavily on human innovation and expertise to function normally, but they recognized long ago that human errors are symptoms of deeper organizational problems. Like respiratory failure or heart failure, errors in medicine demand a diagnostic search for underlying causes and systemic repairs to prevent error recurrence.


This chapter presents a framework in which to understand, investigate, and prevent errors in the intensive care unit (ICU). These principles are widely used in the field of patient safety and can be applied to other areas of health care.



Key Patient Safety Concepts and Definitions


All health care organizations display characteristics of complex adaptive systems, in that they contain groups and individuals who have the freedom to act in unpredictable ways and whose actions are interconnected. High-performing complex organizations follow three broad rules. First, leaders give general direction by articulating values, establishing a clear organizational mission, and setting objectives. Second, appropriate personnel within the organization are given the resources, permissions, and incentives to meet patient needs efficiently and safely. Finally, organizational constraints prevent providers from delivering inefficient or unsafe care.


These three rules are expressed through organizational structures and processes. Structures are the organizational and management hierarchy, physical facilities, staffing, and capital allocated to perform a process. A process is the way humans and other resources interact to deliver patient care. Together, structures and processes create the final products of health care, which are referred to as outcomes.


An error is a defect in a process that arises from a person, the environment in which he or she works, or, most commonly, an interaction between the two. In the field of patient safety, negative outcomes are termed adverse events. Because patients may experience adverse events as a result of their underlying illnesses, preventable adverse events are differentiated from unpreventable adverse events; the former are due to error, whereas the latter are not. Errors that do not result in patient harm are termed near misses and are more common than adverse events. Safety experts recognize that studying near misses is equally useful for preventing future errors. Table 107.E1 illustrates the differences among these terms.



Errors in Complex Systems


Errors in complex systems can be divided into two types based on where they occur in the system. Active failures occur at the sharp end of a complex system, so named because it contains the people and processes that are most easily identified when an adverse event occurs, owing to their proximity to the patient and the harm. Active failures always involve a human error, such as an act of omission (forgetting) or an act of commission (misperceptions, slips, and mistakes). Human factors research has shown that the incidence of human error is predictable based on the nature of a task, the number of steps in a process, and the context in which these occur. Table 107.E2 provides examples of these types of errors. Although active failures are the easiest to identify, they represent the tip of the iceberg and nearly always rest on deeper and more massive latent conditions in the organization.


Latent conditions are defects in a system that make it error-prone. Arising from the physical environment and from unintended consequences of decisions made by organizational leaders, managers, and process designers, latent conditions constitute the unforeseen blunt end of a system that has “set people up” to fail at the sharp end. Indeed, investigations of most near misses and preventable adverse events identify multiple causative latent conditions. For example, consider an investigation of a central line–associated bloodstream infection in the ICU. Several potential latent conditions for this infection are listed in Table 107.E3. Note that if this investigation had focused on active failures alone, it would have stopped short and blamed providers without identifying the underlying latent conditions that allowed the error, and thus the preventable infection, to occur.


Latent conditions breed human error through a variety of factors. Knowledge factor impairment makes an individual’s impression of what is happening inaccurate or incomplete. An example would be an intelligent intern struggling to apply extensive prior “book learning” in the context of actual clinical practice. Excessive mental workload, fatigue, and distractions make it difficult to focus attention and maintain an accurate overview of the complex situation at hand, also known as situational awareness. One example of the latter is the difficulty that an ICU physician may have in remembering to order and follow up on coagulation factors every 6 hours for a patient on heparin while simultaneously managing multiple other critically ill patients. Impaired attention also increases the use of heuristics, which are the cognitive shortcuts we use to increase mental efficiency during times of stress. Although they may increase productivity in the short term, heuristics also increase certain types of human errors. Lastly, strategic factors force providers into difficult trade-offs between opposing objectives when time and resources are limited and risks and uncertainties abound. An example would be deciding whether or not to give the last open ICU “crash” bed to a medical/surgical floor patient who is hemodynamically stable, but difficult to manage on the floor because of observation or monitoring demands.


As Figure 107.1 illustrates, the expression of human error or expertise at the sharp end is governed on one side by the evolving demands of the situation being managed and on the other by the organizational context in which an individual is operating. Organizational structures and culture at the blunt end of the system determine what resources and constraints people at the sharp end experience, powerfully influencing how well they will be able to use their knowledge, focus their attention, and make decisions during the course of their work.

