A new technique for recording brain activity for the purpose of controlling a computer or mechanical device—known to scientists as a “brain-computer interface”—can, for the first time, distinguish fine motor activity down to individual finger movement, scientists report.
Their report arrives amid several other advances in neural interfaces, suggesting to some researchers that advanced clinical applications are on the horizon. Other new research in monkeys could one day restore movement to mobility-impaired patients with amyotrophic lateral sclerosis (ALS) and other neurological disorders.
“It’s exciting in terms of thinking about potential new strategies for treating new areas of the brain,” Daryl Kipke, a biomedical engineer at the University of Michigan and director of the Center for Neural Communication Technology, said Nov. 17 at a symposium on neural interface technology during the Society for Neuroscience annual meeting.
“We’re creating new lines of information in the brain,” he said.
Communicating with neurons
Neural interface technology encompasses sensors that record brain activity for the purpose of interpreting a person’s intent, and techniques such as deep brain stimulation (DBS) that alter brain activity to help treat people with various neurological conditions.
When recording cell activity with brain-computer interface (BCI) technology, researchers acquire neural signals using implants surgically placed inside or on the surface of the brain, or else by electroencephalography (EEG). EEG records electrical activity from the brain via several sensors attached to a cap-like device placed on the scalp. Unlike implants, EEG requires no surgery.
Once the neural activity is recorded, researchers must extract from the cell activity those features that reflect the person or animal’s actual or intended action, such as moving an arm muscle. A computer then processes those features and translates them into commands for a device such as a computer cursor or mechanical limb. BCI has enabled monkeys to feed themselves with robotic arms and patients with ALS to control a word processor using only their thoughts.
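The pipeline described above—record activity, extract features, translate them into a device command—can be sketched in a few lines. This is a hypothetical illustration, not any group's actual system: the windowed average stands in for real feature extraction (such as firing-rate or band-power estimates), and the decoder weights are invented.

```python
# Hypothetical sketch of a BCI pipeline: record neural activity,
# extract features, translate features into a device command.
# All numbers and weights here are illustrative.

def extract_features(samples, window=4):
    """Average raw activity samples over short windows (a stand-in for
    real feature extraction such as firing-rate or band-power estimates)."""
    return [sum(samples[i:i + window]) / window
            for i in range(0, len(samples) - window + 1, window)]

def decode_to_command(features, weights):
    """Linear decoder: map extracted features to a 1-D cursor velocity."""
    return sum(f * w for f, w in zip(features, weights))

# Toy "recording": 8 samples of simulated neural activity.
recording = [0.2, 0.4, 0.1, 0.3, 0.9, 1.1, 0.8, 1.0]
feats = extract_features(recording)            # two 4-sample averages
velocity = decode_to_command(feats, [0.5, 1.5])
print(round(velocity, 3))                      # -> 1.55
```

In a real system the decoder weights would be learned from training data, and the command stream would update continuously as new samples arrive.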
In clinical settings, deep brain stimulation can control some movement-related symptoms of essential tremor, Parkinson’s disease and dystonia. For patients who do not respond to other therapies, surgeons implant electrodes to deliver a low-level current to specific brain areas.
A new way to read cells
A recently developed neural recording device, less invasive than those used in DBS or the implants commonly used in animal BCI research, yields surprisingly precise data, according to new research led by Gerwin Schalk, a researcher at the Wadsworth Center, the research arm of the New York State Department of Health. The new BCI device has some of the advantages of both cortical implants and EEG, with fewer drawbacks.
Electrocorticography (ECoG) involves placing an array of electrodes directly on the surface of the brain in a surgical procedure usually performed on patients already on the operating table for surgeries related to brain tumors or epilepsy.
While DBS has proved helpful in its approved clinical applications, inserting the device inside cortical tissue remains a risky procedure. Similarly, brain-computer interfaces that depend on implanted electrodes can lead to complications, making approval for clinical use difficult. As a result, BCI implants have been used in only a handful of human subjects. Surgically placing an ECoG device on the surface of the brain is less risky.
In addition, although signals from implants, which record individual neuron signals (called action potentials), have “very high fidelity”—providing solid, clear data—they also have “serious problems with robustness,” or maintaining a strong signal over time, Schalk said. BCI implants often do not maintain a good signal over long periods of time because of mechanical problems or reactive responses from the brain, each of which seems to occur less frequently with ECoG implants.
The other common BCI technology, EEG, is non-invasive but limited by low fidelity and robustness. Because the sensors must read brain signals through the skull, the signal-to-noise ratio is much lower than that of implants. “There have been some impressive demonstrations in [the] real world,” Schalk said. “But at the same time, EEGs have serious problems with robustness.”
First tested in humans in 2004, ECoG devices have a better signal-to-noise ratio than EEG recordings, thanks to closer proximity to the cells themselves. Schalk believes that, for measuring muscle movement, the fidelity of the signals is comparable to that of neural implants.
“We found that with these [ECoG] signals, even though they’re not implanted into the cortex and don’t record from individual action potentials, they have a tremendous amount of information in areas relevant to brain interfaces,” he said.
Schalk and colleagues recorded areas in the motor cortex of the human brain and, by processing these signals to extract key features, discovered patterns that paralleled the movements of individual fingers during hand movements. “We can reconstruct hand-moving actions with the same fidelity as implanted electrodes,” he said.
In fact, the study represents the first continuous decoding of all individual fingers in any species of animal—for any BCI technology. Previous studies had been able to identify in monkeys which finger moved, but not detailed information about finger position over time. The researchers have submitted the study to the Journal of Neural Engineering.
Eberhard Fetz, a BCI researcher at the University of Washington, called the findings interesting. “It goes to the issue of whether one can extract useful signals from less invasive [techniques],” he said.
Direct muscle control
In another recent first for BCI technology, Fetz and colleagues showed that monkeys, temporarily paralyzed to model patients with nervous system injury, can control wrist movement with brain activity that drives electrodes to stimulate the muscles directly.
In a study reported in the Oct. 15 issue of Nature and presented by Fetz at the symposium, the researchers recorded signals from primary motor cortex cells in macaque monkeys using a neural implant. The researchers taught the monkeys to control cursor movement on a computer screen using movements of the wrist while recording which neurons became active.
Then the researchers used a chemical to temporarily paralyze wrist movement, and the monkeys were faced with the same computer task. But this time the recorded signals were fed to electrodes implanted in the wrist muscles, a technique called functional electrical stimulation (FES). With time, the monkeys learned to artificially control simple wrist movements through recorded cell activity.
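The closed loop in the Fetz study—recorded cell activity driving muscle stimulation in place of the blocked nerve pathway—can be sketched as a simple mapping from firing rate to stimulation level. The threshold, gain, and cap below are invented for illustration; the actual study's stimulation parameters are not given in this article.

```python
# Hypothetical sketch of a recurrent BCI / FES loop: convert a recorded
# motor-cortex firing rate into a muscle stimulation level, bypassing
# the paralyzed pathway. All parameters are illustrative assumptions.

def fes_stimulation(firing_rate, threshold=10.0, gain=0.8, max_level=100.0):
    """Map a firing rate (spikes/s) to a stimulation level: nothing below
    the threshold, then proportional to the excess, capped at max_level."""
    if firing_rate <= threshold:
        return 0.0
    return min(gain * (firing_rate - threshold), max_level)

# As the animal tries to move, firing rises and stimulation follows.
rates = [5.0, 12.0, 40.0, 200.0]
levels = [fes_stimulation(r) for r in rates]
```

Because the animal sees the resulting wrist movement, it can adapt its own cell activity to the mapping—which is why, as the study found, even arbitrary cell-to-muscle pairings became controllable with practice.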
The study “speaks to the concept of creating connections between different areas of the brain [in] real time,” said Daryl Kipke, the Michigan biomedical engineer and also the symposium chairman.
The “recurrent brain computer interface,” as the system is called, has not yet moved to the realm of human studies. One limitation is the invasive nature of recording and stimulation. “In humans you would want to use something less invasive, like surface recordings,” Fetz said.
Although past research has focused on enabling a patient to check e-mail or control a word processor with either implants or EEG, studies such as Schalk’s and Fetz’s could trigger additional research into FES and other sophisticated technologies.
“I expect that ... in the next 5 to 10 years we will see, with whatever signal platform, ... neurally controlled devices that will become widespread [among] people with disabilities,” Schalk said.