Significant Developments in the 1990s



Fig. 12.1
Sixteen new anesthesia societies appeared in the 1990s, several from the new countries that arose from the ashes of the Soviet Union





Better Neuromuscular Blocking Drugs? Yes and No


The search for better neuromuscular blocking drugs continued, particularly the search for a drug with a faster onset and an assured termination of effect. Succinylcholine fitted that description, but it caused muscle pain, untoward potassium release, and prolonged effects in patients with pseudocholinesterase deficiency. So the search went on. Organon synthesized rocuronium, and in 1989 the first report of its use in animals appeared [1]. Within a year, results in humans were published [2], leading to its release for clinical use in 1994. It appeared to have a faster onset than its then competitor, vecuronium, and many anesthetists used it. A competing company, Burroughs Wellcome (now GlaxoSmithKline), had manufactured the rapid-acting drug atracurium. However, atracurium caused histamine release/allergic reactions, making it less than perfect. A stereoisomer of atracurium, cisatracurium, avoided these side effects. In 1995, cisatracurium dosing and recovery characteristics were reported for humans [3], and it became clinically available soon thereafter. Like atracurium, cisatracurium elimination did not require renal or hepatic function, instead depending on Hofmann elimination (hydrolysis). Recovery was therefore assured, and many anesthetists felt more secure with its use.

But neither rocuronium nor cisatracurium was perfect; neither produced paralysis as rapidly as did succinylcholine. In 1999, Organon reported synthesis of a drug with these properties, rapacuronium [4]. Initial use in hundreds of patients gave promising results. Unfortunately, when hundreds of thousands of patients were given rapacuronium, a few (especially children) developed severe bronchospasm [5], and in 2001 rapacuronium was removed from the market.


The Continuing Advance of Inhaled Anesthesia


In much of the world, halothane was the inhaled anesthetic of the 1960s; enflurane, the 1970s; and isoflurane, the 1980s. By the 1980s, the practice of surgery increasingly turned from a hospital-based to an ambulatory-based practice, taking anesthesia with it or being led by it. The “come-and-go” nature of ambulatory surgery increased the need for anesthetics with more transient effects, anesthetics that allowed a more rapid recovery and the safe release of patients to go home. Thus the shift to propofol in the 1980s. In the 1990s, desflurane and sevoflurane, poorly soluble inhaled anesthetics that complemented the rapid recovery from propofol, were released for clinical use.

In 1990, Ron Jones described some of desflurane’s effects in volunteers [6]. Other studies in humans concurrently detailed desflurane’s cardiorespiratory, neuromuscular, metabolic and cerebral effects. Anaquest pursued the commercialization of desflurane, with presentations to the FDA in 1991 that gained desflurane’s approval and allowed marketing in 1992, at the same time that isoflurane’s patent expired and its profitability to Anaquest declined (as told in Chapter 10, isoflurane had been synthesized by Anaquest employee Ross Terrell in the 1960s and commercialized by Anaquest in the 1980s).

Sevoflurane was resurrected at about the same time as was desflurane. Terrell had also synthesized sevoflurane in the 1960s but had not pursued its development because it was unstable in the presence of a base, as found in CO2 absorbents. Bernard Regan at Travenol Laboratories had independently synthesized sevoflurane. Studies by Richard Wallin detailed some of sevoflurane’s pharmacology [7,8]. Wallin appreciated sevoflurane’s vulnerability to degradation but was less deterred by it. In 1981, Duncan Holaday reported that sevoflurane induced anesthesia rapidly and smoothly in volunteers [9]. For a time, that was as far as it got; some laboratory studies had found toxic effects of sevoflurane in rats [10,11], and so it was put aside.

Baxter (successor to Travenol) approached Anaquest in 1987 to discuss Anaquest’s licensing of sevoflurane. After a year of considering that possibility, Anaquest rejected Baxter’s offer, reasoning that the lesser solubility and greater stability of desflurane, then in development, made it superior. Furthermore, by a fortunate oversight, desflurane had patent protection. In the 1960s, Terrell had made desflurane using a process that required elemental fluorine. It was dangerous and expensive to make, and its lesser potency (a third that of isoflurane) meant that much more would have to be produced, adding to the expense. Terrell thought so little of desflurane’s prospects that no patent had been obtained and no publication had detailed its properties. The absence of public declarations meant that desflurane could be patented as though it had been discovered yesterday. In contrast, sevoflurane had been in the public domain for decades and had, at most, a 5-year exclusivity protection. Rebuffed by Anaquest, Baxter sold sevoflurane to Maruishi, a small Japanese pharmaceutical company, which after further study brought sevoflurane to market in Japan in 1990, where it was a success…without reports of injury. Maruishi then turned to Abbott Laboratories, which in 1992 accepted an offer to license sevoflurane for worldwide sales excluding Japan and China [12]. In 1993, Abbott made sevoflurane available.

The proponents of sevoflurane and desflurane had grand struggles in the 1990s. Statistically significant differences were found that didn’t always translate to meaningful clinical differences. In 1991, Nobu Yasuda showed that the kinetics of desflurane were superior to those of sevoflurane [13,14]. In 1992, Richard Weiskopf found that desflurane did not produce evidence of renal injury in volunteers [15], but five years later, Eger showed that prolonged anesthesia with sevoflurane at a 2 L/min inflow rate could produce nephrotic levels of albumin in the urine of volunteers [16], a finding confirmed in studies of patients by Higuchi et al. in 1998 [17]. Such transient injury correlated with the amount of a degradation product of sevoflurane, compound A, that the subject breathed [16–19]. These findings appeared to confirm Anaquest’s decision to pursue desflurane rather than sevoflurane. However, although injury might be produced (and even that was not accepted by many sevoflurane proponents), the injury required prolonged (8-hour) exposure at relatively high sevoflurane concentrations (1.25 MAC) with fresh Baralyme® absorbent (which produced greater amounts of compound A from sevoflurane degradation than did other absorbents or partially used absorbents). And the injury was transient. Thus the proponents of sevoflurane might reasonably dismiss any injury as clinically unimportant.

There is a delicious irony to this story. In response to compelling testimony to the FDA regarding the potential for renal injury from Eger (yes, the same Eger as the editor of this book), the FDA limited sevoflurane administration in humans in the US to 2 MAC-hours (e.g., 1 MAC for two hours) at a fresh gas flow of 1 L/min, after which the flow had to be increased to 2 L/min. Because of this proscription, however, most anesthetists used the higher flow (2 L/min, or even 3 L/min), thus generating greater profits for Abbott (the licensee from Maruishi), the rival to the company producing desflurane, the anesthetic favored by Eger. Who would have guessed?

More relevant was the finding that desflurane, but not sevoflurane, might stimulate the cardiovascular system [20,21], and irritate the airway (Fig. 12.2) [22]. Such clinically important unwanted effects of desflurane ultimately caused the popularity of sevoflurane to eclipse that of desflurane. Too late, it was appreciated that such unwanted effects of desflurane only became apparent at concentrations exceeding 1 MAC [6,23], and that the concurrent use of opioids like fentanyl diminished or abolished such effects [24].





Fig. 12.2
Ter Riet et al. induced anesthesia in patients with 2 MAC of desflurane, isoflurane, or sevoflurane, finding postoperatively that few, if any, patients complained of airway irritation when anesthetized with sevoflurane [22]


Dexmedetomidine and Mivazerol


The alpha2-adrenoceptor agonist dexmedetomidine had been developed in the late 1980s, and development continued to its approval for human use in the 1990s. In 1988, Vickery et al. showed that it decreased anesthetic requirement (MAC) in dogs and, if given in sufficient dosage, could alone suffice for anesthesia [25]. It does more than affect anesthetic requirement. It decreases heart rate, and as shown in the late 1990s, it and a sister drug, mivazerol, could protect the heart compromised by coronary artery disease from ischemic injury [26,27].


Advances in Local Anesthetics and Their Delivery


Placement of catheters in the cerebrospinal fluid of the spinal canal requires making a hole in the arachnoid membrane of a size that predisposes to leakage of cerebrospinal fluid and thereby increases the potential for “spinal headache”. It seemed obvious that the smaller the catheter, the better. Animated by this notion, in 1991, Rigler and Drasner used such small microbore catheters to produce spinal anesthesia with lidocaine. To their surprise, they found an associated spinal cord injury [28]. Studies suggested that this resulted because the slow injection, forced by the narrowness of the microbore catheter, concentrated the injected lidocaine, and because lidocaine had a greater inherent toxicity than previously appreciated. Further study led to the 1993 observation that even when a microbore catheter was not used, the subtle neurotoxicity of lidocaine produced a “transient neurological syndrome” (TNS) [29]. This effect was not seen with bupivacaine [30]. The TNS peculiarly associated with lidocaine, plus the rare serious injury, markedly decreased the use of lidocaine as a spinal anesthetic.

Local anesthetics also have a dose-dependent capacity to cause untoward effects from their systemic presence. In particular, bupivacaine can cause profound cardiac depression [31]. Investigators attacked such toxicity in several ways. One effective strategy, proposed in the 1980s, decreased bupivacaine dosage, using a smaller concentration as well as smaller incremental volumes [31]. A less effective one took advantage of potential differences between enantiomers, leading to the 1996 marketing of ropivacaine and, later, of levobupivacaine [the S (−) enantiomer of bupivacaine], which have less affinity for cardiac sodium channels than the R (+) mirror image [32]. But the differences in toxicity between bupivacaine’s enantiomers were small and probably of little clinical significance. Weinberg described another strategy for dealing with toxicity, finding in 1998 that lipid infused intravenously could absorb bupivacaine and thereby decrease its availability and toxicity [33].

Finally, accurate placement of local anesthetic enhances the safety of local anesthetics/regional anesthesia by minimizing the amount of local anesthetic required and by avoiding injection of anesthetic into blood vessels. In 1994, University of Vienna investigators described ultrasound identification of peripheral nerves (specifically the brachial plexus) [34]. This advance revolutionized regional anesthesia, not only decreasing the amount of anesthetic required and potentially reducing complications, but also decreasing patient discomfort, since it avoided the use of paresthesias to guide needle placement. Until that time, paresthesias or anatomical landmarks had been the standard approaches to identifying the correct position of the tip of the needle used to inject the local anesthetic.


Outcomes Studies and Mundane Checklists Add to Safety


Various 1990s outcomes studies increased the safety of anesthesia. Several built on the 1985 observation by Slogoff and Keats that a slower heart rate protected against postoperative myocardial infarction in patients having coronary artery bypass operations [35]. In the late 1990s–2000s, Mangano [36] and Wallace [37] found that administration of atenolol or clonidine to decrease adrenergic activity decreased mortality in surgical patients with coronary artery disease.

Inhaled anesthetics could also play a protective role. In 1997, Cason’s group showed that isoflurane administration produced cardioprotection akin to that produced by “ischemic preconditioning” [38]. Other potent inhaled anesthetics had this effect, a cardioprotection that long (days) outlasted the presence of the anesthetic [39]. Thus, in 2003, De Hert showed that the use of anesthetics such as desflurane or sevoflurane could greatly decrease myocardial injury from the myocardial ischemia inherent to coronary artery surgery [40]. On the other hand, injected anesthetics could sometimes play an untoward role. In 1992, Parke et al. found that prolonged (days) infusion of propofol produced a profound acidosis, lipidemia, and myocardial failure that could be lethal to children [41]. Whether this resulted from the vehicle used to deliver propofol, from propofol itself, or from some other factor was unclear.

The ability of the anesthetist to control physiological variables could increase patient welfare. In 1996, Kurz, Sessler and Lenhardt showed that maintaining normal body temperature protected against wound infection after colon resection [42]. This study profoundly influenced practice in the US and worldwide. Use of forced-air warmers to maintain normothermia in all but brief procedures became standard. In 1997, Hopf and Hunt demonstrated that breathing greater concentrations of oxygen decreased wound infection and hastened healing [43]. It is not clear that this prompted the use of greater inspired concentrations of oxygen.

The US Institute of Medicine report “To Err Is Human”, published in 1999, suggested that anesthesia-related mortality decreased from 2 deaths per 10,000 anesthetics in the 1980s to about 1 death per 200,000 a decade later [44]. This admirable increase in safety resulted from numerous worldwide initiatives. For example, in 1991, Gaba convened a conference focused on human error in anesthesia, leading to the introduction of algorithms and checklists similar to those used to ensure safety in commercial aviation. In 1999, Pronovost et al. reported on the use of such an approach to decrease morbidity and mortality from venous catheter-induced infection [45]. Not everyone agreed that the decrease in the overall rate of mortality was as dramatic as suggested [46].

Safety advanced worldwide. Less developed countries relied on proven, simple approaches to safety, disseminated by means that such countries could access and afford. In 1992, the World Anaesthesia Society, a Specialist Society of the Association of Anaesthetists of Great Britain and Ireland, funded ‘Update in Anaesthesia’, the international journal of the World Federation of Societies of Anaesthesiologists (WFSA), with supplemental funding by the UK Overseas Development Administration.1 The WFSA adopted ‘Update in Anaesthesia’ to spread knowledge of the science and safe practice of anesthesia in developing countries. Free copies were sent to 3,000 English-speaking anesthetists, and free downloads were available from the internet in English, French, Mandarin, and Spanish.

Electroconvulsive therapy (ECT) remains the primary therapy for major depression. It comes at a price: patients correctly apprehend that such therapy can decrease mental function and can be associated with prolonged memory loss. With his anesthesia colleagues in Vienna, the psychiatrist Langer hypothesized that the electrical silence that followed convulsions, rather than the convulsions themselves, produced the desired effect, resetting the brain. For those with limited or no computer expertise, this notion resembles the cure of computer woes sometimes magically achieved by turning the computer off and then on again. Langer reasoned further: perhaps using inhaled anesthetics to achieve electrical silence might produce the same effect as convulsions without causing memory loss. Consistent with that hypothesis, in 1985, they demonstrated that deep anesthesia with isoflurane, inducing a period of ‘electrocerebral silence’, reduced depression in 9 of 11 subjects [47]. They confirmed these results in 1995 [48].

In 1993, a separate group from Würzburg confirmed that burst-suppression-level isoflurane was as effective as ECT [49], with the caveat that anesthesia required more time and monitoring than ECT. Perhaps this explains why the use of isoflurane for management of major depression has not become routine. Or perhaps anesthetic management of depression is not used because psychiatrists make treatment decisions and because the evidence supporting treatment with isoflurane came from small studies, only some of which were randomized and controlled. But it seems a pity this isn’t resolved. We’d rather have a brief anesthesia than ECT for our depression.


Advances in Equipment


Before the 1990s, hospitals and anesthetists purchased anesthesia machines and monitors separately, an arrangement offering the advantage of modular selection of the “best” device. However, displays, controls and alarms varied, predisposing to confusion. In the 1990s, purveyors of anesthetic equipment integrated the anesthesia machine, ventilator and monitors into an “anesthesia workstation” in which all modalities were displayed in one place, with alarms coordinated and prioritized. But the cost of the new machines could approach $100,000.

Engstrom (later Datex-Engstrom and then General Electric) released the Anesthesia Delivery Unit (ADU), incorporating a new form of the variable bypass vaporizer, in 1995. This was the first anesthesia machine in which gas flows through the bypass versus the vaporizer sump were controlled by a computer. The computer control-module was separate from the anesthetic sump; the former was programmed to deal with any of the anesthetics (including desflurane), while the latter (the sump) was specific to the anesthetic. Based on feedback from agent monitors, the computer-controlled flows were adjusted to deliver the agent concentration dialed on the anesthesia machine. As suggested in the previous paragraph, the vaporizer was integrated into a complete anesthesia machine, the ADU.
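To illustrate the principle, and only the principle, the following minimal Python sketch shows one way such control might work. It is a hypothetical sketch, not the ADU's actual control law: the saturated-vapor value, the proportional gain, and the simplifying assumption that the delivered concentration equals the sump fraction times the saturated concentration are all illustrative assumptions.

```python
# Hypothetical sketch of computer-controlled variable-bypass vaporization:
# route a fraction of the fresh gas through the sump so the mixed output
# approximates the dialed concentration, then trim that fraction using
# feedback from an agent monitor.

def initial_sump_fraction(dialed_pct: float, saturated_pct: float) -> float:
    """Open-loop starting point: assume gas leaving the sump is saturated,
    so delivered % ~ fraction * saturated %, i.e. fraction = dialed / saturated."""
    return min(max(dialed_pct / saturated_pct, 0.0), 1.0)

def feedback_step(dialed_pct: float, measured_pct: float,
                  fraction: float, gain: float = 0.02) -> float:
    """One proportional-control iteration using the agent monitor's reading."""
    error = dialed_pct - measured_pct
    return min(max(fraction + gain * error, 0.0), 1.0)

# Example: sevoflurane dialed to 2 vol% (saturated vapor ~21 vol% at 20 °C).
fraction = initial_sump_fraction(2.0, 21.0)
for measured in (1.5, 1.8, 1.95):  # simulated monitor readings nearing 2%
    fraction = feedback_step(2.0, measured, fraction)
    print(f"sump fraction: {fraction:.4f}")
```

A real machine must also compensate for temperature, total flow, and the added volume of vapor; the point here is only the feedback structure that let one computer module serve any agent-specific sump.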

In 1981, Schwilden supplied an analysis suggesting the feasibility of a computer-controlled infusion of an anesthetic or anesthetic adjuvant drug in a manner that achieved a targeted concentration at the effect site of the drug, wherever that might be [50]. By the 1990s, several investigators had applied Schwilden’s approach (e.g., see Raemer et al. [51] and Shafer et al. [52]). By 1995, machines to provide a Target Controlled Infusion of propofol (e.g., the ‘Diprifusor’) [53] were approved in much of the world…but not in the US.
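The idea can be sketched with a standard three-compartment pharmacokinetic model: give a bolus that fills the central compartment to the target concentration, then infuse at whatever rate replaces drug lost to elimination and redistribution. The Python sketch below is hypothetical; its rate constants are merely illustrative (roughly Marsh-like values for propofol), not Schwilden’s published parameters or those of the Diprifusor, and real systems refine the target at the effect site rather than in plasma.

```python
# Hypothetical plasma-targeting TCI sketch using a three-compartment model.
# Parameters are illustrative only, not those of any approved device.

def tci_plasma(target_ug_ml: float, duration_s: int,
               v1_l: float = 15.9,          # central compartment volume (L)
               k10: float = 0.119 / 60,     # elimination rate (per s)
               k12: float = 0.114 / 60, k21: float = 0.055 / 60,
               k13: float = 0.042 / 60, k31: float = 0.0033 / 60,
               dt: float = 1.0) -> list[float]:
    """Each step: advance the model (Euler), then give exactly the drug (mg)
    needed to restore the central compartment to the target concentration."""
    a1 = a2 = a3 = 0.0                       # drug amounts (mg) per compartment
    target_mg = target_ug_ml * v1_l          # ug/ml * L = mg
    doses = []
    for _ in range(int(duration_s / dt)):
        # elimination and inter-compartment transfer
        d1 = (-(k10 + k12 + k13) * a1 + k21 * a2 + k31 * a3) * dt
        d2 = (k12 * a1 - k21 * a2) * dt
        d3 = (k13 * a1 - k31 * a3) * dt
        a1, a2, a3 = a1 + d1, a2 + d2, a3 + d3
        dose = max(target_mg - a1, 0.0)      # top up to the target
        a1 += dose
        doses.append(dose)
    return doses

doses = tci_plasma(target_ug_ml=3.0, duration_s=600)
print(f"initial bolus ~{doses[0]:.0f} mg; "
      f"infusion at 10 min ~{doses[-1] * 3600:.0f} mg/h")
```

This naive “top up each second” loop approximates, in discrete time, the bolus-elimination-transfer logic underlying early TCI systems: a large first dose, then an infusion that starts high and declines as the peripheral compartments fill.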

SIM 1, the first computer-controlled, anesthesia-dedicated mannequin simulator, was made for use at the University of Southern California in the 1960s [54]. Although it appeared promising, it was never released commercially. In the 1980s, David Gaba at Stanford University [55], and Michael Good and JS Gravenstein at the University of Florida, created simulators for investigating human performance, training and safety in anesthesia [56]. And from the late 1990s through to 2010, training and recertification with computerized simulators grew rapidly despite limited evidence of sustained efficacy. In 2000, there were fewer than 200 computerized mannequin simulators, but by 2011 this had increased to more than 7000 (see Chapter 37).
