“Paralyzed from head to toe, the patient, his mind intact, is imprisoned inside his own body, unable to speak or move,” Jean-Dominique Bauby wrote after suffering a brain-stem stroke in 1995. “In my case, blinking my left eyelid is my only means of communication.”
Bauby, who compared his state to being trapped within a deep-sea “diving bell,” died of heart failure in 1997, shortly after his book (now a film), “The Diving Bell and the Butterfly,” was published. Had he lived some years longer, new technologies based on brain-machine interfaces could have helped him to escape his imprisonment—and not just through flights of imagination.
During the past decade, researchers have developed thought-controlled limbs, advanced cochlear implants, and even an electrode array that feeds a digital camera’s output into the visual cortex of a blind person’s brain. Arguably the most ambitious of today’s brain-machine devices is one that aims specifically to help locked-in patients like Bauby by converting their inner thoughts to real-time synthesized speech. But the effort to develop this electronic “speech prosthesis” shows just how difficult it can be to meld mind with metal.
Finding the Right Neurons
The system requires the implantation of a tiny, sensitive electrode in what is known as the premotor speech area of a patient’s brain. “We concentrate on the articulating area that controls the tongue, the jaw and the lip,” explains Philip Kennedy, the Atlanta-area neurologist and neuroscientist who heads the speech prosthesis project and started a company called Neural Signals Inc. in 1987.
Widely regarded as a pioneer in the neural implant field, Kennedy has been designing and developing the devices since the 1980s. He received a Discover Award from Discover Magazine in 1999 for an earlier device that allowed locked-in patients to move a cursor on a computer screen with their thoughts. But the speech prosthesis is his most challenging project yet.
To begin with, the motor neurons that normally become active with speech are not in precisely the same place from patient to patient and have to be located with a functional magnetic resonance imaging scan before the electrode is implanted. In late 2004 Kennedy performed the scan on his first speech-prosthesis subject, Erik Ramsey, a locked-in patient who had suffered a brain-stem stroke after a car accident at age 16.
Ramsey had to lie inside the MRI machine looking at pictures of animals, mentally trying to vocalize phrases such as “this is a dog,” to make his speech-associated neurons light up for the scanner. If he moved his head a fraction of an inch, the scans had to be done again.
Implanting the Electrode
When the densest concentration of relevant neurons had been located, Kennedy and a neurosurgeon colleague performed the extraordinarily delicate task of implanting the electrode assembly. The signal amplifier and transmitter had to be screwed onto the top of Ramsey’s skull, beneath his scalp—but the electrode’s business end went right into the brain.
“We place it down at an angle of 45 degrees, five or six millimeters deep,” Kennedy explains. Too shallow or too deep and it would have missed its target neurons. And it was “absolutely essential,” Kennedy says, that the tiny, fragile gold wires that run into the electrode tip were gently coiled into flexible springs, so that their signal-gathering ends would stay in place among the targeted neurons whenever the brain moved relative to the skull.
The electrode tip was not a simple, naked piece of metal. If it had been, says Kennedy, “the brain would try to get rid of it by forming glial tissue around it, and the signal would be lost, and we’d have to take it out.” Kennedy notes that this is what tends to happen with most other electrodes used in neural implants today. By contrast, he says, “I need at least a fifty-year survival of the electrode.”
In Kennedy’s electrode design, the three gold wires that conduct signals from the brain terminate inside the tiny, cone-shaped tip of a glass micropipette. About a millimeter long, this glass sheath has a lower opening designed to admit axons, which transmit signals between neurons.
To stimulate the local growth of these processes, Kennedy first had to gently wound the implantation area with a tiny blade and forceps (“in a controlled way,” he explains). He also filled the glass electrode casing with a proprietary mix of nerve growth factors—proteins that spur nerve cell growth—to encourage axons to grow into the electrode, thereby holding the device snugly in place and also bringing the neural signals inside, close to the electrode wires.
Gathering and Processing Data
After verifying and fine-tuning the raw signals, Kennedy and his team began the long process of data gathering. In this process they played sounds for Ramsey, who was asked to try to repeat them aloud in his mind while neural recordings were taken via the electrode.
It was a tedious, painstaking process. Like many locked-in patients, Ramsey can communicate solely with his eyelids. He sees poorly because he lacks the ability to coordinate his eye movements, and he tires easily. Kennedy and his colleagues questioned him frequently about his alertness. They also could never be certain that Ramsey actually tried to say the sound as requested—“it was a huge variable,” Kennedy admits.
The hardest challenge of all in Kennedy’s project has been to interpret the noisy firings of Ramsey’s neurons as intelligible speech. First individual neurons’ signals have to be separated from one another—and in Ramsey’s case 41 speech-relevant neurons are within listening range of the electrode. Then the pattern of neuron firings has to be translated into appropriate sounds. “This requires some very sophisticated signal-processing techniques,” notes Frank Guenther, an associate professor of cognitive and neural systems at Boston University.
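The first step Guenther mentions, separating individual neurons’ signals, can be illustrated with a toy sketch (everything here, from the simulated trace to the amplitude-based grouping, is invented for illustration; real spike sorting works on full waveform shapes with far more careful statistics):

```python
import numpy as np

# Toy spike detection: find threshold crossings in a noisy voltage trace,
# then group the detected spikes by peak amplitude as a crude stand-in
# for waveform clustering.
rng = np.random.default_rng(1)
fs = 20_000                                  # sampling rate, Hz
trace = rng.normal(0.0, 1.0, fs)             # one second of baseline noise

# Inject two simulated units with different spike amplitudes.
for t, amp in [(2000, 8.0), (6000, 8.0), (11000, 15.0), (16000, 15.0)]:
    trace[t] += amp

threshold = 5.0 * np.std(trace)              # common rule of thumb
spike_times = np.flatnonzero(trace > threshold)
amplitudes = trace[spike_times]

# Split into "small" and "large" putative units at the amplitude midpoint.
cut = (amplitudes.min() + amplitudes.max()) / 2
unit_a = spike_times[amplitudes < cut]
unit_b = spike_times[amplitudes >= cut]
print(len(unit_a), len(unit_b))              # spike counts per putative unit
```

Only after a step like this can each unit’s firing pattern be fed to a decoder.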
One of several academic experts around the country who is helping Kennedy to develop the speech prosthesis system, Guenther points out that the job is not to decode the neuron firings perfectly, but to do it in a way that allows the patient to “tune” the prosthesis quickly as it plays back the sounds he is trying to vocalize. Previous experiments with neural prostheses in humans and monkeys have shown that feedback can allow rapid learning.
With graduate student Jonathan Brumberg, Guenther has developed a decoding algorithm based on “formant frequencies,” the main audio frequencies that make up spoken sounds. Guenther’s decoder so far is able to take a given neuron firing pattern from Ramsey and interpret it as a simple combination of two formant frequencies—usually enough to define a vowel sound, with the advantage that each frequency will be continuously variable.
“So he’ll be able effectively to steer this around,” Guenther says. “Like aahhh, ooohhhh—he’ll be trying to move between different vowels. He’ll be able to tune the system up.”
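The shape of such a formant-based decoder can be sketched as follows (the linear mapping, the weights, and the vowel table are all stand-ins invented here, not Guenther and Brumberg’s actual algorithm):

```python
import numpy as np

# Hypothetical decoder: firing rates from the recorded units are mapped to
# two formant frequencies (F1, F2), which roughly pin down a vowel.
N_UNITS = 41          # speech-relevant neurons within range of the electrode
rng = np.random.default_rng(0)

# Stand-in decoding weights; in practice these would be fit to recordings
# made while the patient attempts known sounds.
W = rng.normal(scale=10.0, size=(2, N_UNITS))
b = np.array([500.0, 1500.0])    # baseline near a neutral vowel (Hz)

def decode_formants(firing_rates):
    """Map per-unit firing rates (spikes/s) to a continuous (F1, F2) in Hz."""
    return W @ firing_rates + b

# Approximate textbook formant values for a few vowels (Hz).
VOWELS = {"aa": (730.0, 1090.0), "iy": (270.0, 2290.0), "uw": (300.0, 870.0)}

def nearest_vowel(f1, f2):
    """Pick the vowel whose (F1, F2) pair is closest to the decoded point."""
    return min(VOWELS, key=lambda v: (VOWELS[v][0] - f1) ** 2
                                     + (VOWELS[v][1] - f2) ** 2)

rates = rng.poisson(2.0, size=N_UNITS).astype(float)  # ~2 spikes/s, as reported
f1, f2 = decode_formants(rates)
print(nearest_vowel(f1, f2))
```

Because the decoded formants vary continuously, the patient can hear the played-back sound drift between vowels and learn to steer it, which is exactly the tuning loop Guenther describes.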
Still a Long Road Ahead
In late January or early February Ramsey will begin practicing with feedback from Guenther’s decoder. Guenther is optimistic. But the system is being tested only on vowel sounds at the moment. The more difficult consonants will come later.
Jim Rebesco, a neuroscience Ph.D. candidate at Northwestern University who has been working on an alternative decoding method, notes that so far “Erik’s neurons tend to fire only a couple of times a second, which is slower than for natural speech.” It thus could be hard, without a lot of practice, for Ramsey to modulate his “speech” in real time in response to feedback.
Kennedy doesn’t expect to have the system working for the full range of Ramsey’s speech much before 2010. He hopes to implant an electrode in at least one more patient in the next year, but “to make it available on a widespread basis is probably going to take five years,” he says.
Those who could benefit from such a prosthesis include not only locked-in patients but also those with slowly progressing forms of amyotrophic lateral sclerosis, or ALS. Among the latter is physicist Stephen Hawking, who is unable to communicate in real time and can only select letters or words slowly on a computer screen using a joystick he controls with slight movements of one hand.
The work of Kennedy’s team could speed the development of other prosthetic systems, for a wider range of paralyzed patients. “The decoding algorithms should be useful in any motor task, whether speech or movement,” he says.
Kennedy’s special cone electrode also could see wider use. Motor functions are typically controlled by coordinated but separate groups of neurons, sometimes inches apart in the brain, Guenther points out: “I think the best approach will be to use maybe four of these cone electrodes, like Dr. Kennedy has designed, in different locations in the motor cortex, for example, rather than using a large array.”
Other groups have favored electrode arrays to gather signals from a greater number of neurons, but most electrode arrays have proved to be relatively short-lived. “If it’s a solid array of electrodes, it’s going to move around in the cortex,” explains Guenther. “As those arrays move around they cause scar tissue—‘gliosis’—to arise, and that kills the neural signals eventually.”
William Reichert, a professor of biomedical engineering at Duke, agrees that the gliosis problem continues to plague neural prosthesis designers. “You have to figure out a way to integrate the electrode with the tissue,” he says. And although he compliments Kennedy’s work, he cautions that integrating electrodes with brain tissue reliably enough for routine clinical use remains an unsolved problem.
How long will it take to develop the speech prosthesis so that it’s a reliable, off-the-shelf system that seamlessly replaces ordinary speech? “Maybe 20 years,” sighs Kennedy, who already has been working that long on neural interface technology.
In the near term, he emphasizes, his goal is just to get the speech prosthesis technology into as many locked-in patients as possible. There are no hard data on the prevalence of the condition, but he believes that there are probably tens of thousands of people like Erik Ramsey in the United States alone.
“They’re still out there,” Kennedy says. “They’re just put in a nursing home or kept at home and cared for by their spouses or parents. And there’s no association for them, there’s no ‘Jerry’s Kids’ for locked-in patients. There’s nothing like that.”