Progress Report 2009: Brain-machine Interfaces
Sci-fi Concepts Make Clinical Inroads


by Brenda Patoine

January, 2009

A woman in a Boston suburb, “locked in” from a brain-stem stroke for more than a decade, checks her e-mail using thoughts alone to command the computer. In a laboratory in Pittsburgh, macaque monkeys learn to feed themselves marshmallows using a thought-controlled prosthetic arm, adapting and refining its movements as if it were their own appendage. In North Carolina, researchers capture the thoughts of a twelve-pound monkey and transmit them to Japan via high-speed Internet to make a two-hundred-pound humanlike robot walk, apparently the first instance on record of such remote transmission.

Thought-controlled robotics and computers commanded by neural activity recorded directly from the cortex are no longer strictly the province of science fiction; they are here and now. Recent advances have propelled the young field of “neural prosthetics” forward. As researchers work on the details required to take “brain-computer interfaces” into clinical practice, they have an ambitious vision: to restore mobility or communication to patients with severe neurological damage from brain disease or injury.

To date, four human patients have been fitted with brain-computer interfaces—also variously called neural prosthetics, neural interfaces, or brain-machine interfaces. All four have participated in a pilot clinical trial investigating a neural interface called BrainGate. The system is being developed by Cyberkinetics, Inc., a small company founded by Brown University neuroscientist John P. Donoghue.

Scientists have shown that signals recorded directly from the brain can move a robotic arm purposefully. In a study testing this concept, macaque monkeys learned to perform a self-feeding task. (Andrew B. Schwartz, Ph.D. / University of Pittsburgh)

Leigh R. Hochberg, a neurologist at Massachusetts General Hospital/Harvard Medical School and the trial’s principal investigator, said in August that one person is currently enrolled: a 54-year-old woman whose brain-stem stroke twelve years ago left her immobilized from the neck down and unable to speak. Using the BrainGate interface, the woman, whose name has not been made public, has learned to control a computer cursor to perform rudimentary functions, including opening her e-mail and turning on a television or lights. She has also successfully moved an electronic wheelchair, albeit not while sitting in it, using only her intention to do so.

Hochberg said his team worked with her over a thirty-month period in weekly sessions lasting up to eight hours each. Even early in the process, he said, she had achieved “fairly reliable and rapid control of the cursor.” The latest results from the trial were presented at the 2008 Society for Neuroscience meeting in November. “There are some things that we can only learn through regular feedback from the patient,” Hochberg said. “This woman is teaching us a lot.”

Of the three other patients who have been enrolled in the BrainGate trial, two had the device removed after a year—an option that is built into the trial. Both were quadriplegic (paralyzed in all four limbs) following spinal cord injuries at the cervical level. A fourth patient, who had amyotrophic lateral sclerosis (ALS), died after ten months with the interface, due to a ventilator problem apparently unrelated to the brain prosthetic, Hochberg said.

Proof of Principle

Studies in patients with severe disabilities have provided proof of principle that neural interfaces can work, even in people who have been completely immobilized for many years. But the BrainGate interface, while by far the most advanced in clinical development, is a long way from the ideal. Even its lead developer, Donoghue, has conceded that the system as it exists now is cumbersome and impractical—primarily because of the large banks of equipment needed to decode the neural signals into useful commands and the thick bundle of cables, tethered to the patient’s skull, that links the interface to that equipment. Still, by all accounts it is an impressive demonstration of what is possible.

“It’s important to remember that these are individuals who are otherwise locked in,” said William Heetderks, a National Institutes of Health scientist who headed the federal government’s neural prosthetics research program for many years. “The neurons being recorded from have not been used to generate movement for many years, in most cases. When you think of it that way, it’s kind of amazing that this works at all.”

The first results of the BrainGate trial, from a 25-year-old quadriplegic man who has since had the device removed, were reported at the neuroscience meeting in 2004 and later published in Nature.1 When the early results were made public, many people balked at the notion of implanting electrodes in the cortex and questioned whether the technology had been adequately studied in animals before human experiments commenced. Such questions persist.

A monkey feeds itself during a three-dimensional brain control task. In this experiment the monkey controlled the velocity of the arm’s endpoint. In subsequent experiments, the monkey also controlled the opening and closing of the arm’s grip. (Andrew B. Schwartz, Ph.D. / University of Pittsburgh)

“There will always be this issue of when is it ready to go into humans,” Heetderks said at the time. “If you waited until everyone agreed the time was right, it wouldn’t happen for a long time. I think the data supporting putting electrodes into humans was strong and fairly convincing in terms of being safe and realistically having a very good chance of working. Obviously, the FDA was convinced,” he added, since the agency approved the trial.

Monkey Think, Monkey Do

As the BrainGate trial continues to enroll patients, other researchers at the forefront of the field are focused on ever more elaborate demonstrations of the utility of neural prosthetics in nonhuman primates. One of the latest reports, published online in Nature in May 2008, came from the University of Pittsburgh laboratory of Andrew B. Schwartz, one of the field’s pioneers.2 In what the New York Times called “the most striking demonstration to date of brain-machine interface technology,” Schwartz’s group trained two monkeys to control an advanced anthropomorphic robotic arm, complete with shoulder and elbow joints and a clawlike gripper serving as a hand.3

The researchers had implanted into the monkeys’ primary motor cortex the same kind of electrode grid used in the BrainGate trial, which records signals from about one hundred neurons. They then trained the animals to generate patterns of brain activity to move the robotic arm in three dimensions. With their own arms gently restrained and the prosthesis positioned near their shoulder, the monkeys learned, after only a few days, to smoothly and naturally reach the arm out to a piece of fruit or a marshmallow, open the gripper and remove the treat from its peg, bring the treat to their mouths, and open the gripper to release the food. Even more remarkably, the monkeys learned to adapt the movements of the robotic arm, adjusting its trajectory, for example, when a marshmallow became stuck on the gripper or when the researchers changed the location of the treat.

In an editorial accompanying the Nature paper, John F. Kalaska, a neurophysiologist at the University of Montreal, called the work “the first reported demonstration of the use of [brain-machine interface] technology by subjects to perform a practical behavioral act—feeding themselves,” and said it represents the “current state of the art in the development of neuroprosthetic controllers.”4

‘Real-World’ Demonstration

To Schwartz, the primary significance of the report is that “the animal is working in the real world. Up until now, there have just been demonstrations of subjects moving a cursor on a computer screen. Our research shows that this can work in three dimensions, not just two.” (Three-dimensional movement would enable a patient to have far greater capacity for performing basic functions of daily living, such as eating.)

The most recent reports build on previous research in primates, including a 2003 paper from Miguel Nicolelis’s laboratory at Duke University demonstrating that a monkey could be taught to use brain activity to move a disembodied, rudimentary robotic arm. That study was the first published report of a primate reaching and grasping with a robotic arm via a neural interface.5

In work not yet published but reported in January 2008 by the New York Times, Nicolelis’s group demonstrated for the first time that a monkey’s brain signals could make a robot located on the other side of the world walk.6 The researchers recorded the monkey’s neural signals as it walked on a treadmill, then decoded the signals and transmitted them to a laboratory in Japan, where they were fed into an advanced robot that mimics human locomotion. When the monkey walked, so did the robot, and the monkey was able to see the effects on a large-screen video monitor placed in front of her, providing critical visual feedback. The robot’s movement precisely mimicked the monkey’s, according to the news report. After an hour or so, the researchers stopped the monkey’s treadmill. The robot kept walking. Apparently, a subset of the monkey’s neurons had “adopted” the robot’s leg movements as the monkey’s own movements, and encouraged by tasty rewards, the monkey had learned to keep the robot moving.

Miguel Nicolelis and colleagues at Duke University taught monkeys to use brain signals to control the movements of a robot on the other side of the world. The researchers trained some of the monkey’s neurons to “adopt” the machine’s locomotion as the monkey’s own. (Miguel Nicolelis, M.D., Ph.D. / Duke University)

Fantastical as it may seem, the concept behind such demonstrations is fairly straightforward: record electrical activity from the right set of neurons and translate the signals, via sophisticated mathematical algorithms, to power external devices—be it a computer cursor, a prosthetic limb, or a wheelchair. In reality, of course, executing the concept is anything but simple. It has taken decades of concerted effort by a small but growing group of researchers to advance neural interface technology from a pie-in-the-sky concept to here-and-now reality.
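At its core, that translation step is a calibration-and-decoding problem: fit a mapping from recorded firing rates to intended movement, then apply it to new activity in real time. The short Python sketch below illustrates the idea with a plain least-squares fit; the dimensions, the simulated data, and all variable names are invented for illustration, and real systems such as BrainGate use far more sophisticated decoders.

```python
import numpy as np

# Illustrative dimensions: ~100 recorded neurons (roughly the size of the
# electrode grids described above), 2-D cursor velocity, one calibration block.
n_neurons, n_samples = 100, 5000

rng = np.random.default_rng(0)
rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)  # binned spike counts
velocity = rng.standard_normal((n_samples, 2))                       # cursor velocity during calibration

# Calibration: fit a linear map W so that rates @ W approximates velocity.
# Actual systems typically use recursive filters (e.g., Kalman filters),
# but the principle is the same.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Online decoding: each new vector of firing rates yields a velocity command
# that could drive a cursor, a prosthetic limb, or a wheelchair.
new_rates = rng.poisson(5.0, size=n_neurons).astype(float)
vx, vy = new_rates @ W
print(f"decoded velocity command: ({vx:.2f}, {vy:.2f})")
```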

The NIH neural prosthetics program, now in its fortieth year, nursed the young field along by funding many of the research groups now at its forefront. One piece of the program was aimed at doing precisely what has now been done: using neural signals recorded from the cortex to control an external device.

“When we started this, we asked what would constitute a minimum demonstration of feasibility,” Heetderks said. “We decided that one-dimensional control was a reasonable place to start.” The most recent results by Schwartz’s group “are more elegant than we had imagined,” Heetderks said, because they extend movement into three-dimensional space, which allows natural, fluid movement.

Basic Research Laid Groundwork

The latest advances build on decades of basic science research aimed at unraveling the function of the brain’s motor cortex, where movements are initiated and carried out, and relating specific neuronal populations to movement direction and velocity. In the early 1970s, Eberhard Fetz, now at the University of Washington, Seattle, and colleagues demonstrated that monkeys could be trained to increase or decrease their cortical activity to move a device akin to a radio dial up or down, work that Hochberg called “instrumental” in proving the principles behind today’s neural prosthetics. In addition, basic research in the 1980s by Schwartz and his mentor, Apostolos Georgopoulos, a cognitive neuroscientist now at the University of Minnesota, had demonstrated that it should be possible to get “good three-dimensional control” of external devices by recording from as few as fifty or sixty neurons in the motor cortex, Heetderks said.
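The Georgopoulos-Schwartz result rests on what is known as cosine tuning: each motor-cortex neuron fires most strongly for movements in its “preferred direction,” so a rate-weighted sum of preferred directions (the “population vector”) points along the intended movement. The toy Python sketch below illustrates why a few dozen neurons can suffice; the population size, tuning parameters, and names are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical population of 60 neurons, each with a random preferred
# direction in 3-D space (roughly the neuron count Heetderks cites).
n_neurons = 60
preferred = rng.standard_normal((n_neurons, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

BASELINE, DEPTH = 10.0, 8.0  # illustrative firing-rate parameters (Hz)

def firing_rates(movement_dir):
    # Cosine tuning: a neuron's rate peaks when the movement aligns
    # with its preferred direction.
    return BASELINE + DEPTH * preferred @ movement_dir

def population_vector(rates):
    # Weight each preferred direction by the neuron's rate modulation
    # and sum; the resulting vector points along the movement direction.
    vec = (rates - BASELINE) @ preferred
    return vec / np.linalg.norm(vec)

true_dir = np.array([0.0, 0.6, 0.8])  # an arbitrary unit-length reach direction
decoded = population_vector(firing_rates(true_dir))
print("decoded direction:", np.round(decoded, 2))  # close to true_dir
```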

Largely as a result of the strength of such basic research, Heetderks said, “There was never really any question that this was possible, in principle.” Rather, the question was “how much information do you have to pull out of the brain to make precise movements?”

The advent of cochlear implants for the hearing-impaired, first developed in the 1960s and used today by more than 100,000 people, has provided critical proof of principle that it is possible to restore sensory function by targeting a relatively small number of neurons. The devices are composed of a tiny external headpiece and processor that pick up sound waves and convert them into digital signals, which are then transmitted through the skin to the implant, attached to the skull behind the ear. The signals activate electrodes within the cochlea, the spiral inner-ear organ at the heart of hearing, to stimulate neurons connected to dysfunctional inner-ear cells called “hair” cells, which normally transmit sounds to the brain.
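As a rough illustration of that signal path, the Python sketch below splits a sound into twenty frequency bands and turns each band’s energy into a stimulation level for one electrode, mimicking the cochlea’s frequency-to-place mapping. The band edges, sample rate, and every other parameter here are invented for the example and do not reflect any real device’s processing strategy.

```python
import numpy as np

fs = 16000             # sample rate in Hz (illustrative)
n_electrodes = 20      # roughly the electrode count of modern implants

# Stand-in for one second of sound picked up by the external headpiece.
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 440 * t)  # a pure 440 Hz tone

# Split the spectrum into logarithmically spaced bands, one per electrode,
# mimicking how the cochlea maps frequency to position along its length.
spectrum = np.abs(np.fft.rfft(audio))
freqs = np.fft.rfftfreq(len(audio), 1 / fs)
edges = np.logspace(np.log10(100), np.log10(8000), n_electrodes + 1)

# Each band's energy becomes one electrode's stimulation level; the
# 440 Hz tone mainly drives the electrode whose band contains it.
levels = np.array([
    spectrum[(freqs >= lo) & (freqs < hi)].sum()
    for lo, hi in zip(edges[:-1], edges[1:])
])
levels /= levels.max()  # normalize into the device's dynamic range
print("electrode levels:", np.round(levels, 2))
```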

“It is difficult to overestimate the importance of the cochlear implant to the development of neural prosthetics,” said Heetderks, pointing out that early versions of the cochlear implant used only four stimulating electrodes, while modern models use about twenty. “Thirty years ago, I would have said it won’t work: how could you possibly represent all the richness of sound with just a few stimulating electrodes?

“The fact that it worked shows how remarkable the brain is at interpreting scrambled information,” Heetderks added. “It made believers out of doubters.”

Meeting the Challenges Ahead

As the young field of neural prosthetics marches forward, several key challenges remain. In particular, experts cite the need to improve the long-term reliability of the implanted electrode arrays that are used to record neural signals. The current participant in the BrainGate trial has had her implant in place for nearly three years, and while it has continued to operate throughout that period, the researchers have seen fluctuations of unknown cause in the richness of the signals, according to Hochberg. Researchers are also concerned about the “foreign-body response” of immune cells to electrodes chronically implanted in the brain, he said. “It’s clear that we have to improve the recording stability, either by changing the material or the surgical methods used for implanting the electrodes.”

Richard A. Andersen, a neurobiologist at the California Institute of Technology with expertise in optimizing signal-recording methods for neural prosthetics, doesn’t think that “recording longevity,” as he terms it, is a significant limitation for the field. “If the electrodes are well-made and durable, it appears that they can last for years,” he said.

Researchers also hope to make the neural interface systems more practical for human use. Development of a wireless interface is critical to obviate the need for cables running from the brain implant to the decoding hardware. The challenge there, according to Andersen, is to develop an implant that has a built-in power source and is sufficiently low-powered that it will not heat up brain tissue, which could cause serious problems. A number of groups are working to overcome this hurdle, which requires developing a fully implantable electrode array with integrated electronics, a power source, and high-resolution signal transmission. Donoghue’s group at Brown, for example, is developing a fully implantable neuromotor prosthetic “microsystem-on-a-chip,” which incorporates advanced ultra-low-power microelectronic circuits and processors. Fiber-optic technologies are also being explored as a means of providing both power and efficient signal transmission.

Scientists continue to debate which neural signals to capture to achieve the best results. The BrainGate system and the systems used by Schwartz and Nicolelis in the most recent advances target a discrete population of neurons in the motor cortex—an appealing target because of the long history of solid basic research delineating the precise actions of these cortical motor neurons. But other targets may offer advantages as well. Andersen’s team has focused on a part of the parietal cortex involved in reaching movements.

Recording from neurons in this “reach region” of the parietal cortex provides higher-level signals related to the goals of movements—the stage just before a motor command is issued. Andersen said targeting these signals has the advantage of enabling the decoding of a wide variety of cognitive signals, which opens new possibilities for using neural prosthetics for a range of cognitive functions well beyond movement. For example, it might be possible to record thoughts from speech areas to facilitate communication in locked-in patients. In the distant future, electrodes may record from multiple regions of the cortex to drive a potentially unlimited range of cognitive functions.

Despite the challenges, researchers in the trenches of this field are optimistic. “I don’t see any major insurmountable challenges to moving the field forward,” said Andersen. Schwartz echoes this sentiment: “Most of the challenges that remain are not earth-shattering, but rather mundane problems. There is nothing here that a concerted effort can’t address.”

A concerted effort to take neural prosthetics to the next level is precisely what the Defense Advanced Research Projects Agency (DARPA) has in mind with its “Revolutionizing Prosthetics” initiative. With a budget of nearly $50 million over six years, the military-funded project focuses multiple research centers on the goal of producing an advanced neural-controlled prosthetic arm that allows the user the full function and capability of a normal human arm—ideally to allow the user enough fine-motor control to thread a needle or play a piano. DARPA expects to begin clinical trials to test the brain-to-arm interface system in 2009.

Notes

1. Hochberg LR, Serruya MD, Friehs GM, Mukand JA, Saleh M, Caplan AH, Branner A, Chen D, Penn RD, and Donoghue JP. Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature 2006 442(7099):164–171.

2. Velliste M, Perel S, Spalding MC, Whitford AS, and Schwartz AB. Cortical control of a prosthetic arm for self-feeding. Nature 2008 453(7198):1098–1101.

3. Carey B. Monkeys think, moving artificial arm as own. New York Times, May 29, 2008.

4. Kalaska JF. News and views: Brain control of a helping hand. Nature 2008 453:994–995.

5. Carmena JM, Lebedev MA, Crist RE, O'Doherty JE, Santucci DM, Dimitrov DF, Patil PG, Henriquez CS, and Nicolelis MA. Learning to control a brain-machine interface for reaching and grasping by primates. PLoS Biology 2003 1(2):E42.

6. Blakeslee S. Monkey’s thoughts propel robot, a step that may help humans. New York Times, January 15, 2008.