Resuscitation from Shock Following Injury
Donald H. Jenkins
John B. Holcomb
Phillip A. Letourneau
Dustin L. Smoot
Stephen L. Barnes
After the initial evaluation and operative management of the surgical/trauma patient, many patients require further resuscitation, support, and care in an intensive care unit (ICU) setting. This chapter provides a brief outline of considerations, priorities, treatment algorithms, and the newest innovations that may assist any intensivist tasked with managing such critically ill surgical patients.
Statement of the Problem
Surgical patients die from shock abruptly through lack of oxygen delivery to the heart and brain, or subacutely through development of multiple organ dysfunction from late recognition of shock or inadequate resuscitation. Unlike the typical nonsurgical critically ill patient, exsanguination is often the cause of death in the surgical/trauma patient, second only to central nervous system injuries as the cause of death of trauma victims in the United States [1,2,3]. The control of hemorrhage has been identified as a priority in modern trauma patient care, second in importance only to adequate ventilation [4]. Advanced Trauma Life Support teaches a schema that incorporates the vital signs, skin color, capillary refill, and mentation to alert the physician to how severely injured the patient may be and help to quantify how much blood the patient may have lost [4]. By the time the blood pressure falls, the patient has lost 30% to 40% of his or her blood volume, or approximately 2,000 mL. This situation demands rapid action, but action should not wait until this point has been reached.
One classification system defines four types of shock: Hypovolemic (such as dehydration, diarrhea, and hemorrhage, the most common form of shock following major trauma), distributive (such as septic shock, the most common form of shock in the late phase of recovery—5 days or more—after major surgery/trauma), cardiogenic (such as from massive myocardial infarction or arrhythmia), and obstructive (such as from tension pneumothorax, pulmonary embolus, or pericardial tamponade). By far, hemorrhagic shock is the most common form following major surgery/trauma and the major focus of this chapter (although the astute physician should always keep tension pneumothorax in the differential diagnosis). Therefore, in most instances, the ICU physician faced with a surgical patient in shock should direct initial efforts toward correction of hypovolemia.
Without obvious external bleeding, vital signs and evidence of organ hypoperfusion are assessed to evaluate the patient for significant or ongoing hemorrhage. A falling hematocrit may be a sign, but as hemorrhage causes loss of cells and fluid in equal proportion, an isolated normal hematocrit should not be reassuring to the clinician. With very rapid hemorrhage, a patient can die with a normal hematocrit. A fall in central venous oxygen saturation when the cardiac output remains the same may be one of the earliest signs of hemorrhage in the ICU setting as the body begins to extract more oxygen from the remaining blood.
Physiology of Effects of Hemorrhage
The physiologic responses to hemorrhage can be broken into three categories: Hemostasis, oxygen delivery, and immunology.
Hemostasis
If bleeding does not stop, then no intervention can prevent death. It is this concept that has led to some of the most heated debates in the resuscitation literature: “Does resuscitation promote tissue perfusion and cellular metabolism, thus increasing survival, or does the increase in blood pressure destroy clot, promote rebleeding, and decrease survival?” [5]. The astute physician recognizes that both concepts are true. Cellular metabolism must be ensured, without overwhelming the clotting mechanism.
After injury, the body attempts to stop hemorrhage by clotting at the site of vascular injury. This is accomplished by the interaction of circulating clotting factors, platelets, and tissue factors from the injured cells. These factors work primarily to form a "plug" initiated by the physical presence of the platelets and augmented by the cross-linking of fibrin to form a more permanent seal. Factors released by the injured tissue may also trigger constriction of the local blood vessels, decreasing blood flow to the leaking area concurrently with platelet plug formation; this vasoconstriction is mediated both locally by tissue factors and centrally. Finally, when the blood loss leads to a fall in the blood pressure, the clotting efforts are aided by a smaller vessel diameter, decreased wall tension, and lower pressure head.
Oxygen Delivery
In 1872, Gross called shock a “rude unhinging of the machinery of life.” Although this definition is accurate, it is not precise. It is at the level of cellular oxygen delivery and utilization that the understanding of shock is defined. Without oxygen, the cells may survive briefly using anaerobic metabolism. Many of the physiologic defense mechanisms work to augment this delivery and depend on oxygen-carrying capacity, cardiac output, and oxygen delivery to and utilization by the cell.
The oxygen-carrying capacity of blood depends on the amount of circulating hemoglobin, which diminishes continually during hemorrhage. Although erythropoietin stimulates the production of new red blood cells (RBCs) and eventually restores hemoglobin over weeks, this response does not acutely restore oxygen-carrying capacity [6]. As hemorrhage proceeds, the body becomes incapable of supporting metabolic need. The primary defense, however, is the extra capacity inherent in the human system: only approximately 25% to 30% of the transported oxygen is normally used, leaving central venous or mixed venous oxygen saturations in the range of 70%. When fully stressed, extraction improves as anaerobic metabolism leads to lactic acidosis, which shifts the oxygen dissociation curve to favor release of oxygen at the tissue level. This allows much more oxygen to be removed from the hemoglobin, and much lower central venous oxygen saturations.
Cardiac output is the product of heart rate and stroke volume. There is reserve built into the heart rate, in that most people use only approximately two-thirds of their maximal heart rate. Pain, fear, and a variety of baroreceptor responses trigger the release of catecholamines and other factors in response to hemorrhage. These lead to an increased heart rate, and thus increased cardiac output and oxygen delivery. With few exceptions, such as the elderly or those with heart disease, the body achieves this response maximally and in an unaided fashion.
The stroke volume can be increased by increased contractility through the direct effects of many of the same substances that increase heart rate. In hemorrhage, however, the primary determinant of cardiac output is the volume of blood coming into the heart (preload). During hemorrhage, the preload falls. As the blood pressure falls, oncotic forces predominate and fluid begins to shift into the vascular space. This "borrowing" of fluid from the interstitial, and ultimately from the intracellular, space occurs slowly, allowing a gradual restoration of the blood pressure, often not to normal, which gives the clotting mechanisms time to stop the bleeding and stabilize the clot.
Other factors that restore the preload include the prevention of further fluid loss via the kidney. A lower blood pressure leads to less filtration and less fluid removed in urine. In addition, antidiuretic hormone and the renin–angiotensin systems act to augment this response. Catecholamines and large proteins circulate as part of the defense signaling systems. These augment the oncotic pull. The glucose that increases with the release of corticosteroids also acts to pull fluid into the vascular space. Finally, the body is willing to shunt blood away from most areas of the body to support cardiac preload and the brain. This shunting is very evident in the pale clammy skin of hemorrhagic shock. Initially it is less evident in the relative ischemia that occurs in every other organ of the body.
Oxygen delivery (DO2) to the tissues is the product of cardiac output and arterial oxygen content (CaO2, the total amount of oxygen in the blood, which depends largely on the amount of hemoglobin present). During hemorrhage, these components are altered, and oxygen delivery may be decreased. Cardiac output can be indexed to body surface area and expressed as cardiac index, which when multiplied by CaO2 yields an oxygen delivery index (DO2I). Normal DO2I is roughly 450 mL per minute per m2 and it may increase by as much as 30% in response to injury. The primary goal of shock resuscitation is the early establishment of "adequate" oxygen delivery (DO2) to vital organs; however, what constitutes adequate is subject to ongoing debate.
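The relationship above can be made concrete with a short calculation. This is an illustrative sketch: the constants 1.34 (mL of O2 bound per gram of saturated hemoglobin) and 0.0031 (mL of dissolved O2 per dL per mm Hg of PaO2) are standard physiologic values not stated in this chapter, and the patient values are invented for illustration.

```python
def cao2(hb_g_dl, sao2, pao2_mm_hg):
    """Arterial oxygen content (mL O2 per dL): hemoglobin-bound plus dissolved."""
    return 1.34 * hb_g_dl * sao2 + 0.0031 * pao2_mm_hg

def do2i(hb_g_dl, sao2, pao2_mm_hg, cardiac_index):
    """Oxygen delivery index (mL O2 per minute per m2).
    The factor of 10 converts CaO2 (per dL) to match the cardiac
    index, which is expressed in L per minute per m2."""
    return cardiac_index * cao2(hb_g_dl, sao2, pao2_mm_hg) * 10

# Healthy adult at rest (illustrative values):
normal = do2i(hb_g_dl=15, sao2=0.98, pao2_mm_hg=95, cardiac_index=3.2)
# After major hemorrhage, anemia and low cardiac output compound each other:
shocked = do2i(hb_g_dl=7, sao2=0.95, pao2_mm_hg=90, cardiac_index=2.0)
print(round(normal), round(shocked))  # prints: 640 184
```

Note how the hemorrhage scenario cuts delivery to well under half of normal even before oxygenation falls, which is why transfusion and cardiac output, not supplemental oxygen alone, are the levers of hemorrhagic shock resuscitation.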
The complications of a "successful" resuscitation that should be watched for are related to ischemia and reperfusion injury. These may manifest as multiple organ dysfunction syndrome or individual organ dysfunction. Hepatic dysfunction may present as jaundice and coagulopathy. Pulmonary dysfunction may be seen as acute respiratory distress syndrome, and renal failure as rising blood urea nitrogen and creatinine. Compromise of intestinal mucosa may lead to sepsis, bleeding, or perforation.
Immunology
Hemorrhagic shock alone, without tissue injury, was once thought to have minimal consequences [7]. Hemorrhagic shock alone has been shown to result in a multitude of responses, however, especially in the immune system. The immune system is intended to protect the body from infectious invaders and remove aberrant cells to prevent cancer. During shock, cells produce messengers or mediators that signal for the help of this system [8]. During reperfusion, these mediators are released widely into the systemic circulation.
Currently, a focus in hemorrhagic shock research is the effect of resuscitation on the immune and coagulation systems. Extensive research in the last decade has shown that hemorrhagic shock from trauma activates both the inflammatory and coagulation systems, resulting in profound perturbations in both. This is often manifested by a spectrum of clinical problems: acute lung injury progressing to acute respiratory distress syndrome, systemic inflammatory response syndrome, hypo- or hypercoagulation, bleeding or diffuse thrombosis, and even multiple organ dysfunction syndrome [9]. One of the major areas of study involves the activated immune response that results in enhanced activation and increased adhesion of leukocytes. During this activated stage, neutrophils can release harmful reactive oxygen species, which are thought to play a major role in loss of capillary integrity. This leads to edema and the sequestration of fluid in the tissues outside the vascular space.
Although it has been clear that the immune response occurs in response to shock and reperfusion, it now seems that some of the resuscitation fluids used to treat the shock may trigger this altered immune and coagulation response. The immunologic response to various resuscitation fluids is now an area of intense research [10,11].
Hemorrhagic Shock Management
The first goal in hemorrhagic shock, following assessment of the ABCs (airway, breathing, and circulation), is to stop ongoing bleeding. In the surgical/trauma patient reaching the ICU, this has generally been accomplished in the emergency department (ED), interventional suite, and/or operating room. During the ICU phase, resuscitation is continued, and can last 24 to 48 hours. The goal of resuscitation is to restore normal perfusion to all body organ systems, using the components of oxygen delivery: hemoglobin, cardiac output, and oxygenation. In hemorrhagic shock, this primarily involves hemorrhage control, reversal of coagulopathy, and then administration of sufficient blood products and crystalloid to restore normal aerobic metabolism.
Confirmation of a hypoperfusion state (shock) is obtained through simple examination and a single blood test. Shock is diagnosed by the effect of hypoperfusion on the body's organ systems: low blood pressure, tachycardia, oliguria, tachypnea, decreased mental status or agitation, skin cyanosis, pallor, decreased pulse character, or mottling. Equivocal cases can be confirmed by obtaining an arterial blood gas and looking for a base deficit exceeding 6, or by an elevated serum lactate (more than 2 mmol per L). Hypoperfusion implies inadequate delivery of oxygen to the body's cells. Oxygen delivery is a function of cardiac performance, arterial hemoglobin content, and arterial oxygen saturation. All attempts to correct shock involve optimizing these three variables. Hypotension is not synonymous with shock: shock can be present in a normotensive patient, and conversely, not all hypotensive patients are in shock. Hypotension, like many other physical findings, is but one sign helpful in the overall clinical picture of shock diagnosis. As detailed below, reestablishment of normal heart rate, blood pressure, and urine output does not equate to resolution of shock; resolution of tissue hypoperfusion, as manifested by lactate clearance, does.
Resuscitation of the patient in shock should be approached in two phases, based on the end points of the resuscitative effort. In the first phase, the patient should be resuscitated to a systolic blood pressure of 80 to 100 mm Hg or mean arterial pressure of 55 to 65 mm Hg, a urine output of 0.5 mL per kg per hour, and an arterial oxygen saturation of 93% or higher. These end points are pursued to prevent imminent death from hypoperfusion to the heart and brain, and should be achieved optimally within 1 hour.
In the second phase, resuscitation is continued with fluid, as well as inotropic and vasopressor agents, as needed, to the goal of eliminating the base deficit of metabolic acidosis, or, if available, restoring the serum lactate or base deficit to a normal level. This end point is important in reversing systemic anaerobic metabolism, which, if unrelieved, leads inexorably to multiple organ failure (MOF). This goal should be accomplished within 12 to 24 hours.
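As a rough sketch, the two-phase end points described above can be expressed as simple threshold checks. The function names are invented for illustration, and the base deficit cutoff of 2 mEq per L for "normal" is an assumption not stated in the chapter; this is not a validated protocol.

```python
def phase1_met(sbp_mm_hg, urine_ml_kg_hr, sao2):
    """Phase 1 (goal: within ~1 hour): prevent imminent death from
    hypoperfusion of the heart and brain. The chapter's target band is
    SBP 80-100 mm Hg (the upper bound reflects permissive hypotension
    until hemorrhage control); here only the floor is checked."""
    return sbp_mm_hg >= 80 and urine_ml_kg_hr >= 0.5 and sao2 >= 0.93

def phase2_met(lactate_mmol_l, base_deficit_meq_l):
    """Phase 2 (goal: within 12-24 hours): reverse systemic anaerobic
    metabolism, i.e., normalize the serum lactate and base deficit."""
    return lactate_mmol_l <= 2.0 and base_deficit_meq_l <= 2.0

print(phase1_met(sbp_mm_hg=85, urine_ml_kg_hr=0.6, sao2=0.95))  # True
print(phase2_met(lactate_mmol_l=4.2, base_deficit_meq_l=8.0))   # False
```

The example mirrors a common clinical scenario: phase 1 targets are met early, yet persistent lactic acidosis shows the patient remains in shock and requires ongoing resuscitation.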
Lessons Learned from War
The modern-day trauma system owes a large debt to combat casualty care. Techniques from system development to operating room procedures have their roots in battlefield medicine. Resuscitation, too, is no stranger to advancement during wartime. To understand the advances made and the differences that exist within modern combat resuscitation strategies, it is important to understand the history of combat resuscitation.
A modern ATLS resuscitation strategy of 2 L of crystalloid owes its roots to strategies developed during the Vietnam War. Based on research by Shires [12,13], Dillon [14], and others, the need for volume resuscitation was brought to the forefront: the goal was to replace an interstitial volume debt created by fluid movement into the vascular space during hemorrhagic shock. High-volume crystalloid resuscitation strategies were used to replace the volume lost by the bleeding soldier in ratios of 3:1 to as high as 8:1. The physiology was sound, but, disappointingly, when outcomes were examined, clinical efficacy in the form of improved survival was not seen over previous wars: killed-in-action rates were 16% for the US Civil War, 19.6% for World War I, 19.8% for World War II, and 20.2% for the Vietnam War [15]. In fact, the adopted strategy of IV fluid administration would spawn its own set of complications, most notably the emergence of "Da Nang lung," now known more widely as acute respiratory distress syndrome. Initially felt to be the result of the volume of resuscitation, its mechanisms, linked to immunologic effects, would eventually come to be understood, beginning with the case series of 12 patients (seven with trauma) published by Ashbaugh et al. in the Lancet in 1967 [16].
High-volume crystalloid resuscitation strategies were further supported by Shoemaker's early prospective study of 67 patients with greater than 2,000 mL of blood loss. Supranormal end points of resuscitation, defined as a cardiac index > 4.52 L per minute per m2, oxygen delivery ≥ 670 mL per minute per m2, and oxygen consumption ≥ 166 mL per minute per m2, were assessed against "standard" therapy. Survival was nearly doubled in the supranormal group, with statistically significant decreases in length of ICU stay, mean number of organ failures, and days of ventilation [17]. Despite these promising results, several other groups failed to achieve similar findings. More importantly, with an ever-increasing understanding of the immunology of intravenous fluids and their proinflammatory properties, the complications of high-volume crystalloid resuscitation for combat casualties came into question.
If aggressive crystalloid resuscitation was not the answer, then what would the optimal resuscitation strategy be? A report by the Institute of Medicine (IOM) in 1999, as well as two consensus conferences held by the Office of Naval Research, the US Army Medical Research and Materiel Command, and the Uniformed Services University of the Health Sciences in 2001 and 2002, tried to answer the question.
The IOM report was the first to recognize several inadequacies of the then-standard fluid therapy. First noted was the paucity of good Level I and II data to support the then standard of care. Second, the immunologic activity of commonly used intravenous fluids and the deleterious effects of high-volume resuscitation were better defined as they related to complications [17]. This report would mark a significant paradigm shift. Initial recommendations were to replace the racemic mixture of D- and L-isomer Lactated Ringer's (still clinically available) with the L-isomer only. Replacement of lactate with ketones was advocated. Finally, the report supported the initial battlefield use of low-volume hypertonic saline (HTS) resuscitation [18]. A 250-mL bolus of HTS was chosen based on research showing decreased neutrophil activation and increased oncotic properties, as well as the battlefield logistics of less fluid for frontline medics to carry.
The 2001 consensus conference took it one step further by defining what the endpoints of resuscitation would be on the battlefield [19]. Triggers for fluid resuscitation would be systolic blood pressure less than 80 mm Hg or absence of palpable radial pulse, decreasing blood pressure, or altered mental status with no confounding brain injury [19]. This protocol allowed for “permissive hypotension” during resuscitation until definitive hemorrhage control. The goal was not to return blood pressure to normal, but rather to target clinical goals of mentation and palpable pulse. These protocols were developed with several civilian trauma studies in mind.
The first, by Bickell and Mattox at Ben Taub, randomized 598 adult patients sustaining penetrating torso trauma with a systolic blood pressure less than 90 mm Hg to either standard fluid therapy with Lactated Ringer's or IV cannulation with no fluid infusion. Although controversies over study design and protocol surround the results, a significant survival benefit (70% versus 62%) was seen for the delayed resuscitation arm [20].
Second were several studies suggesting that early aggressive fluid resuscitation before hemorrhage control may have a deleterious effect. As early as 1964, Shaftan et al. published data showing that aggressive volume correction slowed spontaneous control of arterial bleeding [21]. This was followed by military research in swine by Bickell et al. Adult swine had their infrarenal aorta cannulated with a stainless steel wire. The wire was pulled, creating a 5-mm aortotomy and free intraperitoneal hemorrhage. Eight pigs received 80 mL per kg of Lactated Ringer's, while the control group received nothing. Hemorrhage was significantly higher in the intravenous fluid group (2,142 ± 178 mL vs. 783 ± 85 mL, p < 0.05), as was mortality (8 of 8 vs. 0 of 8, p < 0.05) [22]. This ultimately culminated in a complete reversal of the high-volume crystalloid resuscitation strategy of the Vietnam War.
If awake, alert, and having a palpable pulse, a soldier sustaining a penetrating wound should have an IV placed, but no fluids infused; PO fluids would be encouraged and evacuation undertaken to the next level of care. If resuscitation had to be undertaken, the panel, again favoring a low-volume strategy, recommended 500 mL of hetastarch (Hespan or Hextend), as FDA approval for HTS was lacking. The hetastarch bolus could be repeated, at which point a reassessment was done; if there was no response, the possibility of futility was entertained [23].
Expanding on this, the 2002 consensus conference, held in conjunction with the Canadian Defence and Civil Institute of Environmental Medicine, reexamined prehospital requirements for fluid therapy. The "hypotensive" strategy was again approved, but the recommendation for initial battlefield fluid was changed to hypertonic saline dextran (HTS-D), based on then-current research showing a favorable volume expansion profile of the dextran combined with the inflammatory inhibition of the HTS component [24,25].
Current strategies in the Iraq and Afghanistan wars are very similar. First and foremost, the problem had to be defined within the unique set of circumstances present in live-fire situations. The first point of care is the battlefield medic. It was recognized that logistical problems exist in bringing care to the wounded at the point of injury. Hemorrhage control still remains the first priority in resuscitating the injured patient, for if quick, effective hemostasis cannot be achieved, fluid therapy has no hope of working in austere environments where definitive therapy may be hours away [23]. This has led to the reintroduction of vascular tourniquets, the use of battlefield hemostatic dressings, and newer therapies such as Factor VII to arrest hemorrhage so that resuscitation efforts can be effective, a discussion of which is beyond the scope of this chapter.
As recognized in the previous consensus conferences, if medics are to be mobile and effective on the battlefield they need the ability to carry their supplies with them [18,19,23,24]. This makes low-volume intravascular expansion much more attractive. For this reason, colloid solutions, specifically Hespan or Hextend, continue to be the fluid of choice for military applications [23]. HTS-D has fallen out of favor due to more current civilian prehospital data that has shown an increase in mortality in trauma patients during interim analysis of the recent ROC trial [26].
With the choice of fluids made (Hespan or Hextend), the next decision point is how to get those fluids into an injured soldier. Trauma providers know the key tenet of ATLS: "two large-bore IVs in the antecubital fossa." This principle becomes increasingly difficult to follow in combat conditions. To this end, the US military takes a different approach. A wounded soldier who is awake, alert, and has a palpable radial pulse receives a single 18-gauge peripheral IV (chosen for ease of cannulation over a larger-bore IV), and PO fluids are encouraged [23]. If IV access cannot be obtained or conditions will not allow access, a sternal intraosseous (IO) device is placed. The sternum was chosen as the reproducible target because extremity injuries prevail in current warfare while the trunk remains relatively protected by modern armor. The sternal IO can be placed using reproducible landmarks, quickly and in low- or no-light conditions, making it extremely beneficial in modern combat [23].
Resuscitation then continues as appropriate with evacuation to the next level of care. It is at this level that the paradigm has shifted dramatically. The emphasis now is on damage control. This pertains not only to the way in which the operations are done (quick procedures leaving abdominal wounds open, temporary packing for hemorrhage control, and temporary vascular shunts) but also to the way in which resuscitation is continued. The use of early blood and coagulation component therapy as well as fresh whole blood (FWB) is emphasized. Again logistics dictate limited storage capabilities in far forward treatment centers. This continues to promote a walking blood bank using fellow combat troops as donors, a luxury not afforded by the civilian trauma provider.
Clinically, FWB has been demonstrated to reverse dilutional coagulopathy, with evidence that a single unit of FWB has a hemostatic effect similar to 10 units of platelets [27,28,29,30,31,32,33,34]. In a retrospective study of the FWB procedures at one U.S. Combat Support Hospital in 2004, 87 patients received 545 units. In that experience the FWB drive was called for only after the patient had received a massive transfusion, yet the transfusion of FWB resulted in significant improvements in both hemoglobin concentration and coagulation parameters [32].
The nature of military medical logistics frequently limits the availability of FFP, platelets, and cryoprecipitate for transfusion in theaters, giving the battlefield physician few options in the treatment of traumatic coagulopathy. However, the use of FWB in massively transfused patients may circumvent the problem of dilutional coagulopathy. Consider the usual mixture of one packed RBC unit (335 mL) with a hematocrit of 55%, one unit of platelet concentrate (50 mL) with 5.5 × 1010 platelets, and one unit of FFP (275 mL) with 80% coagulation factor activity. This combination results in 660 mL of fluid with a hematocrit of 29%, 88,000 platelets per μL, and 65% coagulation factor activity. By definition, transfusion of these standard components will only serve to further dilute critical factors in a bleeding casualty. In contrast, FWB is replete with functional platelets as well as fully functional clotting factors. A 500-mL unit of FWB has a hematocrit of 38% to 50%, 150,000 to 400,000 platelets per μL, and 100% activity of clotting factors diluted only by the 70 mL of anticoagulant [35]. In addition, the viability and flow characteristics of fresh RBC are better than their stored counterparts that have undergone metabolic depletion and membrane loss.
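The dilution arithmetic above can be checked with a simple volume-weighted calculation. This sketch computes only the hematocrit and platelet count of the reconstituted mixture; it lands within a point or two of the chapter's figures, with the small differences reflecting rounding and assumptions about residual plasma and red cells carried in each component. The coagulation factor activity is not computed here, since it depends on further assumptions about the plasma content of the PRBC and platelet units.

```python
# Volume-weighted mixing of one unit each of PRBC, platelets, and FFP,
# using the unit volumes and contents stated in the chapter.
prbc_ml, prbc_hct = 335, 0.55
plt_ml, plt_count = 50, 5.5e10      # platelets in one concentrate unit
ffp_ml = 275

total_ml = prbc_ml + plt_ml + ffp_ml          # 660 mL of combined fluid
hct = prbc_hct * prbc_ml / total_ml           # red-cell fraction of the mix
plt_per_ul = plt_count / (total_ml * 1000)    # 1 mL = 1,000 microliters

print(f"{total_ml} mL, Hct {hct:.0%}, {plt_per_ul:,.0f} platelets/uL")
```

Even this back-of-the-envelope estimate makes the chapter's point: recombining standard components yields a fluid far more dilute than whole blood in every critical element.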
Initial retrospective studies by Holcomb found higher 24-hour (96% vs. 88%, p = 0.018) and 30-day (95% vs. 82%, p = 0.020) survival in a group of combat casualties when FWB was used [36]. The immunology and pathophysiology behind these improved clinical outcomes continue to be an active area of research. Military and civilian evidence also indicates that a higher ratio of FFP to PRBC improves outcomes [37,38,39]. The exact ratio is still part of ongoing research, with some evidence suggesting that there may be a survival bias in those patients receiving higher ratios. Despite these controversies, the early and aggressive use of blood and coagulation factors forms the cornerstone of damage control resuscitation.
Damage Control Resuscitation
The concept of damage control resuscitation or hemostatic resuscitation has rapidly evolved on the modern battlefield. This concept is philosophically derived from the widely practiced damage control surgery approach to severely injured patients. Understanding the epidemiology of combat casualties is paramount to devising a logical resuscitation strategy. Most deaths (80%) in combat operations are not preventable [40,41]. Of the remaining 20% of potentially preventable deaths in combat casualties, two-thirds are from hemorrhage. Furthermore, the killed-in-action rate is lower than at any time in history, while the died-of-wounds rate has increased, largely due to improved body armor, rapid evacuation, improved extremity hemorrhage control, and medic training [40]. With the recent widespread use of tourniquets and hemostatic dressings for compressible hemorrhage control, the current unmet need is for rapid, effective interventions for noncompressible hemorrhage from the neck, axilla, thorax, abdomen, groin, and pelvis.
Fortunately, most casualties receive at most one to four units of packed RBCs after injury and are not at high risk of presenting or developing a coagulopathy and subsequently dying [42]. Only 5% to 10% of all combat casualties require massive transfusion (10 or more units of packed RBCs) and this group constitutes those at risk for hemorrhagic death [43]. These same patients are those who will benefit from early use of recombinant activated factor VII (rFVIIa), as described in the Clinical Practice Guideline (Table 158.1).
The 5% to 10% of all combat casualties that require massive transfusion fall into two broad categories. Group 1 patients are the wounded who are clearly in profound shock, arrive moribund, and are resuscitated with heroic efforts. These casualties do not pose a diagnostic dilemma; rather, they require immediate hemorrhage control and very rapid resuscitation with the optimal ratio of all available products. Surgically, the only question is what cavity to enter first, as they usually have multiple significant injuries. Frequently, these casualties have severely injured extremities, requiring life-saving tourniquets and delayed completion amputations after successful truncal hemorrhage control. These casualties, if they survive the initial 10 to 15 minutes of resuscitation in the ED, require the full massive transfusion protocol and surgical intervention described in the following sections.
Group 2 patients are more difficult to recognize. Typically they are young soldiers with incredible physiologic reserve who arrive "talking and looking good," but who are actually in shock, have had significant blood loss, and soon progress to cardiovascular collapse. This classic presentation occurs about once a week at a busy combat hospital. The challenge is rapidly separating these critical casualties from those who are truly hemodynamically stable. These casualties require rapid and accurate diagnosis of their hemorrhagic injury. This group needs immediate hemorrhage control, as fast as group 1; however, they are much more difficult to diagnose initially. Traditional reliance on mental status, blood pressure, and pulse rate is notoriously inaccurate for individual risk stratification [44,45,46,47].
Fortunately, there are five risk factors that are easily identified very early in the hospital course of severely injured casualties, each of which independently predicts the need for massive transfusion and/or increased risk of death. These simple variables are now available within 2 to 5 minutes after presentation in every ED and each of these variables is independently associated with massive transfusion or death after trauma; any one of them should prompt activation of the massive transfusion protocol (discussed later).
First, an initial international normalized ratio (INR) of 1.5 or more reliably predicts those military casualties who will require massive transfusion [48,49,50]. Patients who have a significant injury present with a coagulopathy as a marker of severe injury. Severity of injury and mortality are linearly associated with the degree of the initial coagulopathy [35,47,48,49,50]. Second, a base deficit of 6 or more is strongly associated with the need for massive transfusion and mortality in both civilian and military trauma. Patients have an elevated base deficit before their blood pressure drops to classic "hypotension" levels [51,52,53]. Third, a temperature of 96°F or less is associated with an increase in mortality. Trauma patients who are hypothermic are in shock, not perfusing their mitochondria, and are not generating heat fast enough to keep up with their ongoing heat loss [52,53,54].
Fourth, a hemoglobin of 11 g per dL or less on presentation to the ED is associated with massive transfusion and a mortality rate of 39% [43]. Otherwise young, healthy soldiers who present with a low hemoglobin have only one reason for their anemia, namely, acute blood loss [43,55]. Lastly, a systolic blood pressure of 90 mm Hg or less is indicative of casualties who have lost more than 40% of their blood volume (2,000 mL in an adult), are experiencing impending cardiovascular collapse, and have a significantly increased mortality [56,57].
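Taken together, the five early risk factors can be summarized as a simple screening check. The function below is an illustrative sketch of the thresholds listed above, not a published protocol; as the text notes, any single positive trigger should prompt activation of the massive transfusion protocol.

```python
def massive_transfusion_triggers(inr, base_deficit, temp_f, hgb_g_dl, sbp_mm_hg):
    """Return the subset of the five early risk factors that are present.
    Each independently predicts the need for massive transfusion and/or
    increased risk of death; all are obtainable within minutes in the ED."""
    triggers = {
        "INR >= 1.5": inr >= 1.5,
        "base deficit >= 6": base_deficit >= 6,
        "temperature <= 96F": temp_f <= 96,
        "hemoglobin <= 11 g/dL": hgb_g_dl <= 11,
        "SBP <= 90 mm Hg": sbp_mm_hg <= 90,
    }
    return [name for name, present in triggers.items() if present]

# A hypothetical "group 2" casualty: normotensive and warm on arrival,
# yet already coagulopathic and acidotic.
print(massive_transfusion_triggers(inr=1.6, base_deficit=7,
                                   temp_f=97.5, hgb_g_dl=12.5,
                                   sbp_mm_hg=118))
```

Note that this example patient would look "stable" by vital signs alone; the INR and base deficit are what unmask the occult shock, which is precisely the point of using laboratory triggers rather than blood pressure and pulse.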
The current resuscitation protocol for combat casualties not only has an effect on current military outcomes (initial reports show case fatality rates dropping from a historic 20% to close to 10%) but has also provided exciting tools for civilian trauma providers [40,58].
Table 158.1 U.S. Central Command Clinical Practice Guideline for Use of Recombinant Factor VIIa (rFVIIa) and Thawed Plasma
Emphasis on early hemorrhage control and damage control resuscitation through aggressive replacement of blood components and coagulation factors still needs further study, but it remains one of the positive hallmarks of modern combat medicine. From the point of injury on the battlefield to arrival at definitive care facilities, the current combat casualty enters a well-thought-out system of multiphasic resuscitation with specific goals to be achieved at each level: early hemorrhage control, limited intravascular replacement until definitive control is available, and the early use of blood and coagulation factors in a damage control resuscitative strategy.