Imagine That!
Neural Prosthetics Harness Thoughts to Control Computers and Robotics


by Brenda Patoine

January 2005

Brain-machine interfaces. Microelectrodes implanted into the cortex to read neural signals. Robotic arms that respond to mere thoughts. Changing the television channel just by imagining it.

Welcome to the new era of neural prosthetics, pulled from the pages of science fiction and, after years of basic and animal research, landing in the real world of human clinical testing. The aim is high: restoring some degree of independence to people who are completely paralyzed or “locked-in” from injury or disease. Reaching that goal means overcoming a series of technical and medical challenges, but recent advances suggest that the possibilities may be as unlimited as the imagination.

At the Society for Neuroscience (SfN) annual meeting, a dozen or more scientific groups in the vanguard of this young field offered a preview of what the future may hold. Many brought videos from their labs to demonstrate their progress, sometimes quite dramatically. Some showed monkeys using thoughts alone to reliably and quickly move cursors to targets on a computer screen, or to reach for food with a robotic arm. A team from the California Institute of Technology revealed how the very act of planning a movement could be harnessed to power a computer program. At the University of Lund, Sweden, researchers have developed a sophisticated prosthetic hand that closely mimics complex finger movements.

 


Scientists are searching for ways in which neural prosthetics might help people with paralyzing injuries or disease. Alfred Pasieka/Science Photo Library 

The Power of Thought

In one of the most remarkable testaments to how far the field has come, Brown University neuroscientist John Donoghue presented video of a quadriplegic man checking his e-mail merely by thinking about it. The cursor obediently moved along the words as the man read the message, as if steered by some unseen hand that existed only in his mind.

The man is 25-year-old Matthew Nagle of Massachusetts, who was left completely paralyzed by a stab wound that severed his spinal cord three years ago. With the help of a brain-computer interface recording signals from his motor cortex, the part of the brain where movement is controlled, he was also able to control lights and a television, draw shapes on a computer screen, and play video games in real time, simply by imagining the movements.

How is this possible? The concept is fairly simple: record thoughts from the brain and translate them into actions on a computer (or, eventually, a prosthetic limb, a wheelchair, or other assistive device). The actual process, of course, is anything but simple.

In Nagle’s case, a chip the size of a baby aspirin was surgically implanted in his primary motor cortex, its 100 hair-thin microelectrodes projecting about a millimeter into the brain’s wrinkled outer layer. The electrodes capture minute electrical signals from motor neurons as Nagle imagines specific movements. Surgically attached to the top of his skull is an external interface, a bit smaller than a soup can, which transmits the neural signals to a processing station. There, a program converts the signals into cursor movements—up, down, left, or right—according to Nagle’s intentions. All of this happens in real time, allowing him to adjust or correct the movements as necessary.
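
To give a flavor of the translation step, here is a minimal sketch of one approach common in the research literature: fitting a linear filter that maps binned firing rates to cursor velocity. It runs on synthetic data, every number in it is invented for illustration, and it should not be read as Cyberkinetics’ actual algorithm.

```python
import numpy as np

# Minimal linear-decoder sketch on synthetic data (illustrative only,
# not the actual BrainGate algorithm): map binned firing rates from
# ~100 electrodes to a 2-D cursor velocity via a least-squares filter.
rng = np.random.default_rng(0)
n_channels, n_bins = 96, 2000              # electrodes, 50-ms time bins

# Synthetic calibration data: firing rates paired with the cursor
# velocities the user was imagining at the same moments.
true_map = rng.normal(size=(n_channels, 2))
rates = rng.poisson(5.0, size=(n_bins, n_channels)).astype(float)
velocity = rates @ true_map + rng.normal(scale=2.0, size=(n_bins, 2))

# Calibration: solve rates @ W ~= velocity for the decoder weights W.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Online use: each new bin of firing rates yields a velocity command,
# which is integrated into the cursor's position on the screen.
position = np.zeros(2)
new_rates = rng.poisson(5.0, size=n_channels).astype(float)
position += new_rates @ W                  # the up/down/left/right step
print("decoded cursor step:", new_rates @ W)
```

Because each new bin of activity produces a fresh command, the user sees the cursor respond continuously and can correct its course on the fly, which is what makes the closed-loop control described above possible.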

Caution and Excitement

As of November, Nagle was the only person to have received the implant; another four participants are being recruited for the study, which has been approved by the Food and Drug Administration (FDA). Any study with only one subject needs to be interpreted cautiously, but the experiments demonstrate, as Donoghue says, “the feasibility of using direct neural commands to control useful devices.” Imagined movement, in other words, is being put to work. The determined young man has become ever more proficient in his computer “games,” games that are enabling him to be less dependent on others for basic tasks of daily living. His hope is that he will one day regain real independence.

“I think it’s really exciting,” says Bill Heetderks, former head of the neural prosthetics program at the National Institute of Neurological Disorders and Stroke. Even though only one person has the device, “this is a very significant result, and it suggests that the groundwork was appropriately done.”

In particular, Heetderks is struck by the fact that Nagle was able to move the cursor while at the same time talking and attending to other things. “To me, this is very important in considering future uses of a device like this,” Heetderks says. “You don’t want to develop a communications interface that requires total concentration to use.”


BrainGate captures brain signals, transmits them through an interface attached to the skull, and translates them into movement of a cursor on a computer screen. Courtesy of Cyberkinetics Neurotechnology Systems Inc.

In the case of the so-called BrainGate interface being tested in Nagle, the “groundwork” entailed more than a decade of basic scientific research on the activity of nerve cells in the primary motor cortex and how best to record their firing patterns. In a 2002 Nature paper, Donoghue’s group reported that two rhesus monkeys plugged into an early version of BrainGate successfully controlled robotic devices and played computer games with thoughts alone. The group has since refined the system and founded a company (Cyberkinetics Neurotechnology Systems Inc.) to test and commercialize it.

In June, neurosurgeons at Brown implanted the device into Nagle, and four months later the system was still working well. Nagle hopes eventually to walk using prosthetic legs interfaced to his brain—something scientists say is still a long way off.

Evolution of a Field

While the BrainGate system may take neural prosthetics to a new level, it is not the first demonstration that smart micromachines can benefit human brains. Cochlear implants for the deaf are essentially microprocessors that translate sounds into nerve signals the brain’s auditory cortex can interpret. Retinal implants for blindness follow a similar logic, encoding visual signals from the environment into neural patterns the brain “sees.” California Institute of Technology neurobiologist Richard Andersen, who gave a special lecture on neural prosthetics at SfN, also cites deep brain stimulation, increasingly used to treat symptoms of Parkinson’s disease and dystonia, and functional neuromuscular stimulation, which uses focused electrical charges to contract specific muscles, as examples of this emerging field’s current applications.

[BrainGate] is not the first demonstration that smart micromachines can benefit human brains.

None of these devices, however, takes the leap that the latest efforts do: directly recording patterns generated in the brain and using them to reliably control external devices in real time. A few pioneering, if rudimentary, precursors to these efforts have captured the attention of scientists and the news media in recent years. Notably, Philip Kennedy designed a cursor-control interface that was first used in 1998 in a man paralyzed from amyotrophic lateral sclerosis (Lou Gehrig’s disease). The device captured signals from a small number of neurons and was used with some success to enable the man to communicate—albeit laboriously and after much training—by pointing to letters or phrases on a screen. Kennedy’s company, Neural Signals Inc., says preliminary efficacy testing (FDA Phase II) of the system has been completed, and several patients have used the device, which they call the “Brain Communicator.”

For all their innovation, early attempts to harness neural signals have been less than ideal, requiring extended learning phases or intense concentration, or yielding results that beg for refinement. And, as is often the case when a technological advance makes the leap into human use, questions have been raised about whether more research was needed first. These questions linger today, even as the FDA gives its blessing to clinical studies.

“There is always this issue of when is it ready to go into humans,” Heetderks says. “If you didn’t hear that, something would be wrong. And if you waited until everyone agreed it was the right time, it would be a long time.” His view, which the FDA appears to share, is that the basic research and animal data have justified clinical testing.

What’s Next?

As the march to clinical applications continues, the techniques and technologies of the next generation of neural prosthetics are being fine-tuned in laboratories around the world. Among the primary aims of investigation are finding the best nerve cells from which to record, optimizing the recording techniques—implanted electrodes or brain-wave measurements (EEGs) from the surface of the scalp—and ensuring the long-term viability and safety of the devices. To a large degree, technological advances are driving the field, as microprocessors get smarter and prosthetic limbs get better.

Andersen’s group at the California Institute of Technology is focusing on cells in the “parietal reach region,” a high-level brain area at the intersection of the visual and motor cortices that directs the planning of movements, as opposed to their execution. This approach enabled monkeys, after brief training, to achieve about 70 percent accuracy in moving a cursor to various targets on a screen.
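Decoding a planned goal is a classification problem rather than a moment-to-moment steering problem. The sketch below illustrates the general idea on synthetic data—deciding which of several targets a trial’s spike counts most resemble, using a simple nearest-centroid rule. The Andersen group’s published methods differ in detail; every number and name here is invented for the example.

```python
import numpy as np

# Hedged sketch of goal decoding from planning activity: classify
# which of K targets a reach is planned toward, using trial spike
# counts and a nearest-centroid rule. Synthetic data throughout.
rng = np.random.default_rng(1)
n_cells, n_targets, train_trials = 40, 8, 30

# Each target evokes a characteristic firing-rate pattern ("tuning").
prototypes = rng.uniform(2.0, 20.0, size=(n_targets, n_cells))

def simulate_trial(target):
    """Poisson spike counts for one movement-planning epoch."""
    return rng.poisson(prototypes[target])

# Training: estimate each target's mean response from labeled trials.
centroids = np.array([
    np.mean([simulate_trial(t) for _ in range(train_trials)], axis=0)
    for t in range(n_targets)
])

def decode(counts):
    """Pick the target whose centroid is closest to this trial."""
    return int(np.argmin(np.linalg.norm(centroids - counts, axis=1)))

# Evaluation on fresh simulated trials.
tests = 20
correct = sum(decode(simulate_trial(t)) == t
              for t in range(n_targets) for _ in range(tests))
print(f"accuracy: {correct / (n_targets * tests):.0%}")
```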

The team has developed a cutting-edge electrode array—what they call “a prosthesis lab on a chip”—that records firing patterns from hundreds of neurons at once. They have also shown that capturing “local field potentials” of groups of neurons produces a more “robust” signal that is easier to record and lasts longer than single cell recordings.
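Both signal types come out of the same raw electrode voltage; by convention they are separated by frequency band, with LFPs below a few hundred hertz and spike waveforms above. The sketch below shows that standard split using SciPy filters on stand-in data; exact cutoff frequencies vary from lab to lab.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 30_000                                # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
# Stand-in for one second of raw voltage from a single electrode.
raw = np.random.default_rng(2).normal(size=t.size)

# LFP: slow, aggregate activity of a neighborhood of neurons.
lfp_sos = butter(4, 300, btype="lowpass", fs=fs, output="sos")
# Spike band: fast waveforms from individual cells near the tip.
spike_sos = butter(4, [300, 6000], btype="bandpass", fs=fs, output="sos")

lfp = sosfiltfilt(lfp_sos, raw)
spike_band = sosfiltfilt(spike_sos, raw)
```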

Andrew Schwartz’s group at the University of Pittsburgh reported on several monkey experiments in which they tested an interface that feeds signals from the motor cortex to a robotic arm. The team has successfully trained monkeys to use the arm to reach toward a specific target (food, in these cases). Miguel Nicolelis and colleagues at Duke University reported similarly promising results in 2003, also in monkeys using a robotic arm powered by thought.

Taken as a whole, the animal research shows that monkeys, if properly trained, can achieve a remarkable level of control over simple cursor movements. The Donoghue work shows that such control is also possible in humans. As the field matures, the ability to establish two-way communication directly between nerve cells and external devices opens a world of possibilities, as Andersen noted in his lecture: “In the future, neural prosthetics could, at least in theory, be used to read out speech thoughts of the mute, or could even be implanted in emotional areas of the brain to read out emotions.”