Though the versatile human brain, the domain of thought and imagination, may seduce us into believing it is somehow an organ apart from the body, it is emphatically physical. Its demands are great: it consumes 20 percent of the body’s oxygen and 25 percent of its glucose, our body’s basic fuel. When this life support is disrupted by, say, traumatic injury, or during a surgical procedure, the brain begins to do what all organic matter does—spoil. We use refrigerators to stall the putrefaction of foods, and cooling has a similar effect on the brain. Scientists call the preservative cooling of the living human brain “induced hypothermia.”
Medical science’s interest in inducing hypothermia to preserve damaged brain tissue has waxed and waned since the technique first came into regular use in the 1930s. After a couple of decades of popularity, attention dropped dramatically for almost 30 years. Part of the problem was the imprecision of techniques for inducing hypothermia, as well as the dangers of deep cooling. In particular, the time and effort involved in surface-cooling a patient who had experienced cardiac arrest or stroke kept hypothermia from becoming an established intervention. To save a patient’s brain tissue after a heart attack or stroke, after all, time is of the essence. We now know that during a stroke the human brain undergoes irreversible deterioration at a terrifying pace. Every second counts.
To establish hypothermia as a viable intervention, newer, faster, and better-controlled cooling techniques were needed. Beginning in the 1980s, such technologies began to appear, prompting reconsideration of hypothermia’s value to brains in crisis. Today, new methods for cooling are emerging that are, in the words of one neuroscientist, “technologies in search of an application.” Cooling devices are being developed with the expectation that induced hypothermia may become a common intervention. Although few hospitals have full-fledged hypothermia programs, there is a buzz in medicine and the medical industry about the possibilities of reducing the brain’s metabolic needs through cooling, particularly after head trauma. But the question remains: How can the living brain be cooled almost to freezing, maintained in that state with its consumption of metabolic fuel and oxygen reduced to a mere trickle, and then revived with no damaging side effects?
A natural process found in the animal kingdom suggests a tantalizing solution to the problem: hibernation. Scientists have long studied this still mysterious process by which some warm-blooded animals enter a unique state of torpor. Their body temperature drops dramatically, the activity of their central nervous system slows down, and their use of fuel and oxygen (their metabolism) sinks to incredibly low levels, eliminating their need for food during the cold winter months. Hibernation researchers believe that the process might have applications in medicine, but must also confront the fact that humans do not hibernate. Only recently have scientists studying hibernation found evidence to suggest that humans may have an innate ability to hibernate, but that the ability remains latent, perhaps because we have houses and heating to keep us awake and active through the winter. Researchers have also discovered that a reversible state of slowed-down metabolism closely resembling hibernation can be induced in mice, which do not naturally hibernate. That finding raises the possibility that one day human metabolism may be reduced in the same way, permitting prolonged brain or heart surgery that requires the cessation of blood circulation.
Those who study hibernation and hypothermia have a lot to learn from each other. To be effective, therapeutic hypothermia has needed a faster, better-controlled way to achieve cerebral cooling. Scientists studying hibernation, which results in a lowered body temperature, have become fascinated with its latent potential in humans and its ability to reduce the body’s metabolic needs. This, then, is the story of how hibernation and hypothermia researchers are becoming close allies in the battle to save oxygen-deprived brain tissue.
How Hibernation Works
At least as early as 1896, when French physiologist Raphaël Dubois published a seminal study on the periodic arousal of marmots from their winter rest, scientists have tried to understand how mammals survive harsh environmental conditions, such as cold winters, by slowing or arresting their metabolic processes, thereby dropping their temperatures signiﬁcantly and slowing their respiration until it is barely perceptible.
What actually happens when a mammal enters hibernation is still not fully understood, especially at the molecular level. According to Hannah Carey, Ph.D., a professor of comparative biosciences at the University of Wisconsin-Madison School of Veterinary Medicine, true hibernation is characterized by prolonged periods of torpor (days or weeks), allowing the animal to conserve its energy. When temperatures drop, food supplies are short, or, in some cases, water is scarce, a mismatch exists between energy supply and energy expenditure, and it becomes too costly for animals to maintain their normal metabolic rates. Hibernators enter a prolonged dormancy, occasionally re-emerging to feed or excrete waste, only to subside once again into what Carey calls a “natural cold-storage state.”
“There is a great diversity in the hibernating world,” says Carey. “Different animals respond to different Zeitgeber, or temporal cues, such as light, food supply, and temperature. In general, think of it as taking your foot off the accelerator of your car.”
Like a cup of coffee left sitting to cool, a hibernating animal’s body also cools when its metabolism slows, and it no longer makes an effort to stay warm. During hibernation, warm-blooded animals that maintain a speciﬁc body temperature (endothermic creatures) come to resemble cold-blooded animals whose temperature varies with the surrounding temperature (ectotherms). This means, Carey says, that for hibernating mammals in the extreme north, such as the Arctic ground squirrel, core body temperature hovers at 28°F in a surrounding permafrost of -40°F.
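The coffee-cup analogy is close to literal: once an animal stops defending its body temperature, it cools passively toward ambient, roughly following Newton’s law of cooling. A minimal sketch of that idea in Python (the rate constant `k` and the ambient temperature here are purely illustrative values, not measurements for any real hibernator):

```python
import math

def cooling_curve(hours, body_f=98.6, ambient_f=40.0, k=0.3):
    """Newton's law of cooling: once active thermoregulation stops,
    the body's excess temperature over ambient decays exponentially.
    k (per hour) is an illustrative rate constant, not species data."""
    return ambient_f + (body_f - ambient_f) * math.exp(-k * hours)

# The body temperature falls steadily and approaches, but never
# crosses, the ambient temperature.
temps = [cooling_curve(t) for t in (0, 6, 12, 24)]
```

The exponential shape is why an unregulated endotherm ends up, like Carey’s Arctic ground squirrel, hovering just above whatever temperature surrounds it.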
Physiological changes occur as an animal—a groundhog, say—prepares for its winter rest. Fur thickens and fat builds up as the groundhog prepares its hibernation chamber. Whether these changes are initiated by some genetic trigger, by an opiate-like molecule commonly known as the hibernation induction trigger, or through crosstalk between neurons in the hypothalamus—the mammalian “thermostat”—is unclear. What we do know is that, as a groundhog fashions a burrow, its brain tells its body to stop defending normal metabolic levels: the heart and respiration rates slow, oxygen consumption decreases, thereby depressing metabolism, and the animal’s core body temperature drops closer to the ambient temperature. Other changes in the hibernating animal’s body can include increased antioxidant defense, immunosuppression, and reduced blood coagulation. Perhaps most important of the metabolic changes is the slowed production of adenosine triphosphate, or ATP, the molecule that supplies the energy an organism uses in its daily operations. Through a process known as oxidative phosphorylation, the body produces ATP; when ATP is later broken down, the energy released is what the body uses to keep itself alive. Less ATP, therefore, means less available energy.
Seeking the “Hibernating Chemical”
Hibernation interests medical researchers because it is a kind of controlled, reversible hypothermia. A groundhog enters a state that, in appearance, resembles death. Yet, every spring the groundhog and legions of other animals re-emerge shivering and hungry with no neurological or physical damage. Imagine the possibilities—for medicine, for space travel—if hibernation could be mimicked in humans.
Such possibilities were a passion of an early pioneer of hibernation studies, Wilfred Bigelow (1913-2005), a Canadian cardiologist best known as developer of the ﬁrst successful pacemaker. As a surgeon, he believed that the heart needed to be opened and operated on directly if cardiac interventions, especially open-heart surgery, were to progress. To do this, he proposed cooling the human body to reduce cerebral oxygen requirements, which would allow circulation to be interrupted. A short ﬁlm he presented to the American Surgical Association in 1950 showed him and his team cooling a dog’s core temperature to 68°F and stopping circulation, an event that spurred hypothermia research around the world.
Nonetheless, Bigelow was aware of hypothermia’s dangers and obstacles. Writing in the journal Surgery in 1958, he complained that inducing hypothermia with cold packs or ice baths was too slow. Moreover, shivering, the body’s reflexive muscular response to dropping temperatures, was difficult to control. He also listed cardiac ventricular fibrillation, limited coagulation, lowered immune response, and shock during rewarming as some of hypothermia’s possible negative effects.
“Perhaps the answer lies in hibernation,” he suggested, and noted that the hibernating groundhog can be cooled to 37.4°F and survive total circulatory arrest for more than two hours with no apparent neurological damage. For about 10 years, Bigelow’s research mission was to explore the nexus that unites hibernation and hypothermia. Working under Bigelow’s supervision at the Banting Institute of Toronto General Hospital, a generation of surgical trainees dug up hibernating groundhogs from the snows around Toronto in an attempt to learn the secret of their winter rest.
Bigelow wrote hopefully in his Surgery article, “Perhaps the infusion of some hormone into a human may allow physiologic cooling to a low temperature by altering some phase of cell metabolism.” If he could find this hormone, he believed, he would be able to arrest blood flow without damage to brain tissue. At one point in his research, Bigelow was certain he had found the “hibernating chemical,” but it was merely contamination from plastic tubing. He admitted his mistake with good grace, and his curiosity about hibernation, and his hope of using it as a vehicle to achieve hypothermia, have survived in the work of contemporary researchers.
Interfering with Oxygen
Mark Roth, Ph.D., and his team at the Fred Hutchinson Cancer Research Center in Seattle have come close to achieving Bigelow’s dream. They recently discovered a way to induce and then reverse suspended animation in a kind of roundworm called a nematode. Moving on to bigger animals, they then induced a hibernation-like state in mice. In essence, they were able to engage an organism’s innate “metabolic ﬂexibility,” as Roth calls it, to depress the mice’s metabolism and consequently reduce their brains’ need for oxygen.
Instead of the “some hormone” Bigelow hoped to find, Roth reported in the April 2005 issue of Science magazine that he used two gases, both of which the body produces on its own: carbon monoxide and hydrogen sulfide. These gases are oxygen mimetics; that is, they resemble oxygen at the molecular level and bind to many of the same receptor sites. Because of this resemblance, they compete with oxygen and can interfere with the body’s ability to use oxygen to produce energy.
By impeding an organism’s use of oxygen through exposure to carbon monoxide, Roth and his colleagues were able to cast the nematode Caenorhabditis elegans into suspended animation. Suspended animation, according to Todd Nystul, a graduate student who assisted Roth during his experiments, means just that: no activity of any kind, no movement, no respiration, no cellular division.
C. elegans can enter suspended animation at any stage of its life, explained Nystul. If an embryonic worm is placed in an atmosphere with extremely low oxygen content—anoxia—it will turn off its metabolism for 24 hours or more, thus preserving itself from cellular destruction.
However, when the scientists placed C. elegans in a hypoxic environment, one in which oxygen makes up between 0.01 and 0.1 percent of the atmosphere (far less than room air, but a bit more than anoxia), they discovered that the embryonic worms do not enter suspended animation as they would under anoxic conditions. Instead, they attempt unsuccessfully to maintain normal metabolism and experience cellular death, a kind of burnout, within a day. When oxygen content was raised slightly to 0.5 percent, the nematode embryos developed as usual.
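The thresholds quoted above amount to a simple decision rule. A sketch in Python, using only the figures given in the text; the band between 0.1 and 0.5 percent oxygen is not described in this account, so the sketch flags it rather than guessing:

```python
def embryo_response(o2_percent):
    """Classify the reported response of C. elegans embryos to ambient
    oxygen, per the thresholds quoted in the text. The 0.1-0.5 percent
    band is not described in this account, so it is left undetermined."""
    if o2_percent < 0.01:       # effectively anoxia
        return "suspended animation"
    elif o2_percent <= 0.1:     # hypoxia: too little oxygen to respire,
        return "cell death"     # enough to keep metabolism struggling
    elif o2_percent < 0.5:
        return "not reported"
    else:
        return "normal development"
```

The non-monotonic middle case is the point of the experiment: lowering oxygen further can be safer than lowering it a little.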
“Sometimes no oxygen is better than too little oxygen,” said Nystul. “There is a ﬂickering moment between no oxygen, or very low levels of oxygen, and enough oxygen. You get cell or brain damage when oxygen levels are too low for respiration but high enough to allow for metabolic activity. Cells continue to try to produce energy, and they burn out.” When oxygen levels are low, the oxidative phosphorylation that releases the energy most living organisms need to live can disrupt a cell’s proper functioning and release cascades of free radicals that damage DNA. What the research team discovered was that carbon monoxide, by blocking oxygen, impedes this process in the nematode.
When it came to continuing his experiments on mice, Roth put them not into a state of suspended animation but rather into a hibernation-like torpor, in which their metabolism and respiration were not stopped completely but merely slowed. As Roth wrote in the June 2005 issue of Scientiﬁc American, he essentially turned a warm-blooded creature into a cold-blooded one. The mice in his experiment breathed hydrogen sulﬁde, not carbon monoxide. Hydrogen sulﬁde is a toxic gas perhaps most familiar for giving rotten eggs their distinctive smell. Roth came up with the idea of using hydrogen sulﬁde while watching a television program on cave exploration. The show reported that the main risk in spelunking is encountering pockets of hydrogen sulﬁde, which in high concentrations can render a person unconscious.
In our bodies, hydrogen sulﬁde is released when cells break down proteins. The gas helps to maintain metabolic equilibrium, but it is also considered a poison (in high doses it kills by stopping cell metabolism). Roth postulated that early life on Earth may have used sulfur-containing molecules to generate energy in much the same way that life now uses oxygen. He also speculated that, as organisms made the transition to oxygen, hydrogen sulﬁde took on the role of oxygen’s primary antagonist. By interfering with the oxidative phosphorylation process, hydrogen sulﬁde may represent a protective mechanism in our bodies when we struggle to produce and use energy under conditions of too little oxygen. In other words, it might be a “natural trigger” to turn down our metabolism so that our bodies enter a state in which the demand for oxygen is reduced.
Like carbon monoxide, hydrogen sulfide is an oxygen mimetic and binds to cells that normally receive oxygen molecules. As Roth’s team reported in Science magazine, the mice were exposed to as much as 80 parts per million of hydrogen sulfide. At that level, their oxygen consumption dropped by half, and carbon dioxide production dropped by 60 percent within five minutes. Over the course of several hours, the mice’s metabolic rate dropped by 90 percent.
Their breathing was reduced from 120 breaths per minute to fewer than 10. Their core temperature dropped from 98.6°F to about 3.6°F above the ambient air temperature. The scientists were able to bring average body temperatures of the mice to as low as 59°F just by lowering the room temperature. This is exactly what happens to animals’ body temperatures when they hibernate. Instead of using ice blocks and fans, Roth was able to cool the mice by winding down their internal combustion, using a chemical that their bodies naturally produce. Even more important for potential human applications of the technique, after six hours in a state that resembled death, the mice were exposed to room air and brought back to normal neurological and physiological functioning.
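The figures quoted above are easy to sanity-check with ordinary unit conversions. A quick sketch (values are only those stated in the passage):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# 98.6 F is the familiar 37 C core temperature; the cooled floor of
# 59 F corresponds to 15 C.
normal_c = f_to_c(98.6)   # 37.0
cooled_c = f_to_c(59.0)   # 15.0

# Breathing fell from 120 to fewer than 10 breaths per minute, a drop
# of more than 90 percent, in line with the reported 90 percent fall
# in metabolic rate.
breathing_drop = 1 - 10 / 120
```

Note that a temperature *difference* converts differently from an absolute temperature: a gap of 2°C above ambient is a gap of only 3.6°F, even though 2°C as a reading is 35.6°F.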
Dying to Stay Alive
This new research roils some dearly held beliefs about where life and death begin and end, and what the nature of both is—much as the deﬁnition of “brain death” pressed us to ask whether death is a biological fact or a technological construct. Instead of using metabolic support and artiﬁcial breathing to extend life, as physicians do in an intensive care unit, Roth and his colleagues constrained life processes to mimic death. Yet it is precisely this death-like state that may one day save lives.
Like hibernation and states of suspended animation, hypothermia also blurs the border between life and death, allowing people to survive conditions that would normally kill them. According to John Tercier, M.D., Ph.D., a lecturer in sociology at Lancaster University in England, hypothermia, especially from cold-water drowning, is so effective at mimicking death that instances of it in the 18th century elevated fears of premature burial to almost hysterical proportions. More contemporary stories of people who survived states of severely reduced metabolic activity caused by cold have made their way into the popular media. In 1999, for example, a skier in Norway, Anna Bagenholm, spent an hour submerged in icy water. When found, her core body temperature was 57.2°F, and she was without a pulse or respiration. Yet after a nine-hour resuscitation, she made an excellent recovery. Two years later, 13-month-old Canadian toddler Erika Nordby slipped out of the house at night, wearing only a diaper and T-shirt. The temperature was -11°F, and, when she was found several hours later, her heart had stopped, but she survived with no neurological damage.
The sham death presented by someone in a hypothermic state is one reason physicians distinguish between apparent and absolute death. The American Heart Association recommends that core body temperature be raised to above 95°F before a patient is declared dead. In other words, a patient is dead only if he is warm and dead. Otherwise, he may be deep-frozen and well preserved.
The fantasy of suspended animation, cheating death by imitating it, has seduced not only researchers but also writers for centuries. In Shakespeare’s Romeo and Juliet, Friar Laurence gives Juliet a potion that sends “a cold and drowsy humour” through her veins. It causes all warmth and breath to leave her, but allows her to awake “as though from a pleasant sleep.” In the film Coma, patients are exploited for their solid organs after being put into a suspended state induced by, presciently, carbon monoxide. In the 1979 film Alien, the character Ripley enjoys a state of hypersleep to cope with space travel. Indeed, a little more than a year before Roth conducted his experiments, the European Space Agency announced its plan to induce hibernation in humans for prolonged space travel. “We’re not sure whether human hibernation is possible. But it’s not crazy,” remarked Marko Biggiogera, Ph.D., an Italian hibernation expert, to Nature magazine in August 2004.
Haitian voodoo straddles many categories: medicine and magic, science and sorcery. Central to its rituals is the creation of zombies, the living dead, who are, apparently, put to death only to be resurrected a short time later and coerced into slavery. The old Penal Code in Haiti describes a zombie as a person who was not killed, but “reduced to a state of lethargy, more or less prolonged.” This lethargy is, in appearance, a form of the torpor that Roth induced in mice.
So intriguing were the reports of hungan, or sorcerers, putting to death and reviving people that the ethnobotanist Wade Davis, Ph.D., traveled to Haiti in the 1980s to discover how their potion was administered and what its ingredients were. His motivation for learning about “voodoo death” was to ﬁnd out whether the potion’s ingredients might be applied to patients whose metabolism needed to be turned down. Davis discovered that its active ingredient—the one that causes the victim’s body temperature to drop and breathing to slow almost to the point of clinical death—is tetrodotoxin, a poison found in certain species of the puffer ﬁsh and that occasionally kills a consumer of fugu, still a treacherous delicacy in Japan.
As Davis described in The Serpent and the Rainbow, his sponsors, the psychiatrists Nathan Kline, M.D., and Heinz Lehman, M.D., wanted to know how the potion worked because they believed the beneﬁts to medical science would be incalculable. Kline remarked, “General anesthesia is essential, often unavoidable, always dangerous…If we could ﬁnd a new drug which made the patient utterly insensible to pain…and another which returned him to normal consciousness, it could revolutionize modern surgery.” General anesthesia is an unnatural state. It is also a dangerous one, an induced “central nervous system dysfunction,” as Roderic Eckenhoff, M.D., of the University of Pennsylvania puts it. How anesthesia launches patients into a kind of reversible coma is not entirely understood, and the search for better and safer agents has led to interest in induced hypothermia, which animal studies show reduces the concentration of volatile anesthetics. Hydrogen sulﬁde may provide a swifter route to either hypothermia or general anesthesia, although one must ask whether it will be any safer than traditional anesthetics, such as halothane or isoﬂurane, which anesthesiologists have been using for years.
Hibernation and Humans
The notion that one day a gas such as hydrogen sulﬁde might be used to cast a patient quickly into suspended animation is exciting news to physicians and hibernation biologists. Roth and his colleagues have little doubt that what happened to the mice in their experiments can also happen in humans. “We do share a very similar chemistry,” says Todd Nystul, and Roth’s Hutchinson team plans eventually to test hydrogen sulﬁde on dogs and monkeys.
The promise of translating hibernation’s beneﬁts into the human sphere is what drives many hibernation biologists, such as Hannah Carey. “It would be wonderful if we could use hydrogen sulﬁde or turn on genes to slow down reactions that result from ischemia [lack of adequate blood ﬂow],” she says. She and her colleagues at the University of Wisconsin-Madison have long been interested in hibernation because of its link to hypothermia, as well as the possibility of preserving organs for transplantation through mechanisms found in hibernating animals. A study she and her colleagues published in The American Journal of Physiology in 2005 showed that livers from hibernating ground squirrels are more resistant to damage than are those from summer squirrels or rats, regardless of whether the hibernators are fully torpid or aroused when the organs are harvested. Hibernation appears to protect the tissue from decay because the body’s metabolism has been slowed.
Experiments that biologist Kelly Drew, Ph.D., conducted at the University of Alaska showed that the brains of hibernating Arctic ground squirrels are more resistant to trauma than the brains of nonhibernating squirrels. Tiny probes were inserted into the squirrels’ brain tissue. In the hibernating animals, the probe left a hole but no other visible damage. In the nonhibernating animals, significant cell death and tissue deterioration could be seen.
Perhaps most interesting is a December 2004 study that provided the first evidence ever of a primate—that is, one of our relatives—that hibernates. In this study, published in the Journal of Comparative Physiology B, researchers from Philipps University in Marburg, Germany, studied the Madagascan fat-tailed dwarf lemur. In response to limited food supply, the lemur hibernates in tree holes for seven months, surviving on the fat stores in its ample tail, even though ambient temperatures rise above 86°F. With their metabolism almost at a standstill, the lemurs’ core body temperature follows the contours of daily heat, fluctuating between 55°F and 86°F. That our distant relatives naturally enter torpid periods lends support to the belief that inducing such states in humans may not be unrealistic.
The mechanisms and effects of hibernation may, in fact, be latent in humans, and if Roth’s work proves viable, the implications are staggering. For example, it is difﬁcult to keep the brain tissue of someone who has just had a stroke oxygenated. A blood supply that is cut off, even for a short time, is often the reason for death after a stroke or heart attack. However, if suspended animation proves as expedient in humans as it is in mice, patients experiencing stroke or other trauma could be placed in suspended animation to prevent deterioration of their tissues. When it comes to organ transplantation, the medically acceptable interval might also be extended through the infusion of hydrogen sulﬁde. Currently, a kidney can remain viable for 24 hours before ischemic damage sets in. For livers, the window is 12 hours; for hearts, 4 hours. Roth imagines a time when, through his “de-oxygenation” process, organs might be preserved for days or even weeks.
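The viability windows quoted above can be read as a small lookup table. A sketch (the hours are the figures given in the text; the function itself is illustrative, not any kind of clinical rule):

```python
# Reported viability windows before ischemic damage sets in (hours).
VIABILITY_HOURS = {"kidney": 24, "liver": 12, "heart": 4}

def still_viable(organ, hours_elapsed):
    """True if the organ is within its reported preservation window.
    Illustrative only; real transplant criteria are far more complex."""
    return hours_elapsed <= VIABILITY_HOURS[organ]
```

Roth’s hope, in these terms, is to multiply every entry in that table by days or weeks.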
Paths to Coolness
Spontaneous and uncontrolled hypothermia, such as that experienced by the Norwegian skier and Canadian toddler, forces metabolic processes to slow down, decreasing the cells’ need for oxygen and glucose. Respiration and heartbeat become imperceptible, but, because of decreased metabolic requirements, they perform well enough to preserve the brain’s tissues. Yet, although the cold can save, it can also have deleterious effects, such as shivering, constriction of blood vessels, and a deﬁcient immune response. Since the earliest medical uses of controlled cooling, physicians have sought techniques that adequately balance its risks and beneﬁts.
The ﬁrst clinical uses of hypothermia occurred in the 1940s. Neurosurgeon Temple Fay, M.D., was famous for ﬁlling tubs with ice and opening windows during winter to manage patients with arteriovenous malformations and intracerebral hemorrhage. He reported that recoveries were remarkable. Throughout the 1950s, patients were immersed in cold baths before undergoing cardiac procedures. Ice packs, cooling blankets, and the pumping of cold water into the stomach were all methods used to induce hypothermia.
By the mid-1960s, however, hypothermia’s popularity had waned. Stephan Mayer, M.D., a neurosurgeon at the Columbia University Medical Center, attributed this to two problems. “First, severe complications resulted from hypothermia. Second, our technology for monitoring and providing support was relatively primitive,” he said. Patients would develop clotting abnormalities, pneumonia, or heart irregularities. Hospital stays were prolonged, increasing patients’ chances of contracting infections. Also, if a patient is to benefit from cooling, cerebral metabolism needs to be slowed within 30 minutes. Decades ago, cooling took too long or required too great an effort to be worthwhile.
The medical community reconsidered induced hypothermia when cardiopulmonary bypass allowed cardiac patients to be easily cooled and rewarmed, which led to almost routine use of hypothermia for many procedures—including circulatory arrest, which, for Mayer, provided incontrovertible evidence that hypothermia is neuroprotective. The second change that led to hypothermia’s renaissance occurred in the 1980s, when laboratories showed that mild hypothermia (reducing the body’s temperature to between 92.3°F and 94.1°F), not deep hypothermia, protected the brain during both general and localized impaired blood flow or trauma. Other studies at the time also showed that cooling after trauma could improve outcomes.
The neuroscientist Robert White, M.D., Ph.D., formerly of Case Western Reserve University, also demonstrated the benefits of hypothermia in 1965, when he kept a rhesus monkey brain alive, isolated from its body, for 24 hours. In a speech delivered to the Pontifical Academy of Sciences in June 2004, White described how he wanted to “document the pure physiological, biochemical…and rheological [flow] changes resulting from vascular cooling of the brain.” He found that by lowering the temperature of the isolated brain through a heat exchanger, the organ could survive prolonged periods without blood. At 50°F, he wrote, survival is almost an hour.
Researchers at the Safar Center for Resuscitation Research at the University of Pittsburgh have used extreme hypothermia on dogs in an attempt to slow cellular damage. In 2005, they achieved mixed success at putting dogs into suspended animation. Cardiac arrest was induced in 14 dogs, and the blood was drained from them as a cold saline solution was pumped through their veins, reducing core body temperature from 98.6°F to 44.6°F. The dogs stopped breathing and had no heartbeat for one hour. All the dogs were revived by a reinfusion of blood, and, 72 hours later, all were alive with little or no neurological damage. When the New York Post learned of the Safar Center’s work, it blared “It’s the ‘Night of the Living Dogs,’” but the Safar Center hopes to test these cooling techniques one day on humans. In emergency situations, removing blood from the body may be the only way to sustain tissues suffering obstructed blood flow.
The New Hypothermia
Today, therapeutic hypothermia has both advocates and doubters, and ambiguity about its benefits persists. In February 2001, for example, a multicenter trial that studied the effects of hypothermia on patients with traumatic brain injury was conducted by Guy Clifton, M.D., and his colleagues. The study concluded that, despite contradictory results from smaller clinical studies, therapeutic hypothermia applied within eight hours of injury produced no positive results. In fact, the patients who were rendered hypothermic had more deaths and longer hospital stays than those whose temperatures were kept normal.
The following year, two important studies, one conducted in Australia and one in Europe, were reported in The New England Journal of Medicine. They focused on the use of hypothermia as a neuroprotectant in cases not of traumatic brain injury but of cardiac arrest. Both groups of researchers used cold air and ice packs on patients who had experienced cardiac arrest and who were comatose by the time they reached the hospital. In the European study, led by Joseph M. Darby, M.D., patients were chilled to temperatures between 89.6°F and 93.2°F for 24 hours, whereas the Australian team, led by Stephen A. Bernard, cooled them to 91.4°F over 12 hours. Both studies concluded that patients who received hypothermic interventions recovered better and faster than patients who were kept at normal body temperature. The studies showed that when oxygenated blood rushes back into the brain, the cascade of free radicals set loose from waste accumulation kills cells, as does inflammation. Cooling slowed both processes.
In 2003, a year after those two studies were published, the American Heart Association endorsed hypothermia as a neuroprotectant in comatose patients. In its journal Circulation, the Association recommended that all hospitals use hypothermic interventions when treating patients experiencing cardiac arrest. The endorsement dovetailed with an explosion of new cooling technologies and a growing market for them, into which investors have poured more than $200 million. A principal problem with early hypothermic interventions was that the entire body had to be cooled, even if one only wanted to save the brain’s tissues. To prevent shivering, patients were heavily sedated, and other bodily systems were put at risk from the cold. Methods now are faster and safer. A quick-chilling slurry has been developed by Ken Kasza, Ph.D., a researcher at the Argonne National Laboratory. The cool liquid, not yet used on humans, is pumped into the lungs and, through chest compressions, circulated to the brain. The advantage of this cold “slushy” is that, like Roth’s induced hibernation, it acts quickly. Brain damage ensues after four to six minutes of ischemia, although the heart can be revitalized after as much as 20 minutes’ quiescence. The slurry could be used by paramedics to take advantage of the perilous moments between the time of a heart attack or stroke and arrival at the hospital.
One new technology is a portable canvas cooling helmet developed by Cool-Systems, Inc., based in California, that would target just the brain. Another is a thermoelectric module (TEM), or “brain-chilling chip,” recently developed by scientists at Washington University in St. Louis. Implanted over the neurons responsible for seizures, the module’s sensors detect the onset of an attack and, like water tossed on a heated pan, immediately cool the neurons. Perhaps the most promising technological advances are endovascular, whereby the body is cooled from the inside out by inserting catheters, often into the inferior vena cava, through which cold saline is circulated throughout the body. This technique can induce mild hypothermia in as little as an hour and prevents shivering because the skin is not cooled.
Kenneth Hayes is the CEO of Radiant Medical, a private company founded in 1997 that develops cooling technology. “There needs to be a serious reconsideration of hypothermia because the technology and, therefore, the data are so new,” he says. “The old methods were pretty barbaric. Therapeutic hypothermia was associated with excessive bleeding and arrhythmias. It was also deep, not mild. When you go deep it gets dangerous. With catheters we have precision, control, and speed. You have to cool fast for hypothermia to work. We can do that now.” Radiant is still at the research phase with most of its products, and the market in general is still in its infancy, dominated by Radiant, Alsius (Latin for “cool”), and InnerCool. Hayes predicts that the big market will be for techniques to help manage heart attacks and strokes.
Neurosurgeon Stephan Mayer thinks that, although hospitals should adopt hypothermia programs, they will likely be slow to do so, because “it’s too different.” Some researchers are more skeptical. Michael Diringer, M.D., a professor of neurology at Washington University in St. Louis, argues that, although hypothermia is making great strides, its time has not yet come. “The new instruments make cooling faster, everyone is optimistic, studies suggest it works, but we don’t know everything about hypothermia. It works in cardiac arrest, but there is no solid evidence—no multicenter, randomized studies—that shows it works in stroke or traumatic brain injury,” says Diringer. He regularly gets calls from companies that want him to use their catheters, but he thinks the jury is still out on many potential applications of hypothermia.
Nonetheless, much cooling technology is still in development. The European and Australian studies provide sound evidence that comatose patients who have experienced cardiac arrest do benefit from prolonged cooling. The keys to success are the speed and precision of that cooling. As investigators gain insights from hibernation studies, hypothermia could take its place as the intervention of choice for suspending metabolism, winning precious minutes or even hours of protection for desperately vulnerable brain tissue. If so, and if Roth’s mouse experiments with gas-induced hibernation translate to the human sphere, paramedics and emergency room physicians may one day rush to the side of patients who have suffered a stroke, heart attack, or brain trauma, clutching a canister of hydrogen sulfide.