Predicting Risk, Targeting Attention

Author: Carl Sherman
Published: November 7, 2016

Mental illness, the leading cause of disability in the western world, has long been a principal focus of neuroscience research. A symposium presented by the New York Academy of Sciences offered a progress report for disorders ranging from autism to depression, anxiety to anorexia.

A theme shared by a number of presentations was the growing ability to see past the patterns of thought, behavior, and emotion that characterize these diseases to their underlying neurobiology. Advances in technology were part of the story—not only in the laboratory, but in day-to-day interactions with patients, via mobile and sensor applications.

While the hope of new interventions to help the mentally ill is a persistent subtext of research, Husseini Manji, of Janssen Research & Development, suggested an even more ambitious goal: “There is reason to be optimistic about changing the paradigm from ‘diagnose and treat,’ to ‘predict and preempt,’” he said. (Janssen Neuroscience, a pharmaceutical company, was a principal sponsor of the symposium.) 

Getting a jump on psychosis

Prediction was at the heart of Tyrone D. Cannon’s presentation on psychotic disorders, with the tantalizing prospect of prevention hovering just outside the frame.

Schizophrenia and related disorders typically appear in the late teens and early 20s, and several decades of research have identified a prodrome (early symptoms suggesting onset of a disease) that indicates high risk. Among people who show such symptoms as declining social and academic function and psychosis-like changes in thought and perception, 16 percent will develop psychosis within a year, 25 percent within two years.

Cannon, director of the Clinical Neuroscience Lab at Yale University, described research to sharpen predictive power. With an algorithm that integrated unusual thought content and suspiciousness, age, and neurocognitive variables such as poor verbal learning and slow processing speed, researchers with the North American Prodrome Longitudinal Study (NAPLS), a multi-university consortium, developed a risk calculator whose accuracy compares favorably with algorithms used to predict heart attack and cancer risk.
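The article does not specify the calculator's mathematical form; a common way to combine clinical and neurocognitive predictors into a single risk estimate is a logistic model. The sketch below is a hypothetical illustration only: the feature names follow the text, but the values, weights, and intercept are invented, not the published NAPLS model.

```python
# Minimal sketch of a prodromal risk calculator as a logistic model.
# All numbers are hypothetical; this is not the NAPLS calculator itself.
import math

# Hypothetical standardized predictors for one individual
features = {
    "unusual_thought_content": 1.2,   # higher = more severe
    "suspiciousness": 0.8,
    "age": -0.5,
    "verbal_learning": -1.1,          # lower score = worse performance
    "processing_speed": -0.9,
}

# Hypothetical weights one might learn from a longitudinal cohort
weights = {
    "unusual_thought_content": 0.45,
    "suspiciousness": 0.30,
    "age": -0.20,
    "verbal_learning": -0.35,
    "processing_speed": -0.25,
}
intercept = -1.8

def risk_of_conversion(x: dict, w: dict, b: float) -> float:
    """Return a predicted probability of conversion to psychosis."""
    score = b + sum(w[name] * value for name, value in x.items())
    return 1.0 / (1.0 + math.exp(-score))

print(f"Estimated conversion risk: {risk_of_conversion(features, weights, intercept):.1%}")
```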

“There’s still room for improvement,” he said. “What if we could integrate biomarker data?”

NAPLS researchers first looked at brain structure. A small study suggested that baseline MRI could predict conversion to psychosis among high-risk youth as accurately as the demographic-neurocognitive variables used earlier.

Pursuing another line of inquiry, they analyzed blood samples. An index combining markers of inflammation, oxidative stress, and hypothalamic-pituitary-adrenal activation was substantially more predictive than the original algorithm.
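How the blood index was constructed is not described here; one simple way to build such a composite is to standardize each marker against a reference distribution and average the z-scores. The analyte names, values, and reference statistics below are hypothetical, purely to show the idea.

```python
# Minimal sketch of a multi-analyte blood index as an average z-score.
# Analytes, values, and reference statistics are hypothetical examples.
from statistics import mean

# Hypothetical analyte measurements for one individual
analytes = {"IL6": 3.1, "CRP": 2.4, "cortisol": 18.0, "oxidized_LDL": 95.0}

# Hypothetical reference (mean, standard deviation) from a comparison cohort
reference = {
    "IL6": (1.8, 0.9),
    "CRP": (1.0, 0.8),
    "cortisol": (12.0, 4.0),
    "oxidized_LDL": (70.0, 20.0),
}

def composite_index(values: dict, ref: dict) -> float:
    """Average z-score across markers; higher suggests more dysregulation."""
    z_scores = [(values[k] - ref[k][0]) / ref[k][1] for k in values]
    return mean(z_scores)

print(f"Composite biomarker index: {composite_index(analytes, reference):.2f}")
```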

“These are small studies, in need of replication in large samples,” Cannon said. This research is under way, with four large research groups pooling brain and blood data.

To be truly useful, more accurate risk prediction would demand effective preventive strategies now lacking. “We need insight into the mechanism of onset to find the right targets,” he said.

Cannon summarized research suggesting that synaptic pruning—a normal feature of adolescent brain development—is amplified in psychotic disorders, resulting in excessive cortical thinning. “What we think is happening is that microglia [immune cells in the brain] are becoming activated and engulfing dendrites at a higher than normal rate.” If this hypothesis is confirmed, interventions that inhibit inflammatory signaling or activate synapses to make them more robust against engulfment “could give hope,” he said. 

Anxiety attracts attention

Daniel S. Pine, who heads the section on development and affective neuroscience in the National Institute of Mental Health Intramural Research Program, reiterated the need to understand mechanisms underlying disease. “A big problem facing the field is that we’re [classifying pathology] based purely on clinical signs and symptoms; we need to move past this to an approach based on what we know about the brain.”

Research should focus on “core psychological processes that we can evoke in the lab to leverage clinical insights,” he said.

Twenty years of research into anxiety indicate that a key component is the “bottom-up” capture of attention by a sudden threat—much like the way one pulls a hand back from contact with a hot stove—too fast for awareness. A conscious component (realizing you pulled back because the stove was hot) comes afterward.

Repeated studies have shown that people with high anxiety have an “attention bias” toward threat—a heightened tendency to monitor potential danger in this way. One study, which tracked eye movement to gauge attention, found that infants who were more distracted by threats were likely to be more anxious ten years later. In a study of 1,000 Israeli soldiers, those with a similar attention bias when they entered the military were at heightened risk of PTSD after combat.

In neural architecture, “the circuit connecting the amygdala to the prefrontal cortex underlies this abnormally brisk reflex,” Pine said. Understanding this component of anxiety might clarify some limitations of interventions like cognitive behavioral therapy (CBT). “If you want to change behavior people aren’t even aware of, you won’t do it by talking to them,” he said. “You need a different approach to change attention bias.”

He and other researchers developed an intervention, attention bias modification treatment (ABMT), which uses a computer game to practice redirecting attention away from a simulated threat (an angry face) “over and over, thousands of times.” More than 30 studies of this training have found a greater impact on adult anxiety symptoms than a control treatment using random stimuli.
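The article does not detail the task itself; one common implementation of attention bias modification is a dot-probe variant in which the response probe always appears where the neutral face was, so that responding quickly requires shifting attention away from the angry face on every trial. The sketch below illustrates that trial logic under those assumptions, not Pine's specific software.

```python
# Minimal sketch of dot-probe-style attention bias training logic.
# Hypothetical layout: the probe always replaces the NEUTRAL face, so
# fast responses require attending away from the angry face each trial.
import random

LOCATIONS = ("left", "right")

def make_trial() -> dict:
    """Place an angry and a neutral face, then put the probe behind the neutral one."""
    angry_side = random.choice(LOCATIONS)
    neutral_side = "right" if angry_side == "left" else "left"
    return {"angry": angry_side, "neutral": neutral_side, "probe": neutral_side}

def run_session(n_trials: int = 160) -> None:
    for i in range(n_trials):
        trial = make_trial()
        # In a real task the faces are flashed briefly, the probe appears,
        # and the participant presses a key matching its side; response
        # times would be recorded here.
        print(f"trial {i + 1}: angry={trial['angry']}, probe={trial['probe']}")

if __name__ == "__main__":
    run_session(5)  # short demo run
```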

A small study found that, by some measures, adding ABMT to CBT more effectively reduced children’s anxiety symptoms than CBT alone. Results from a larger trial, which also analyzed changes in brain function, should be available soon.

“The key idea is, by tying therapies more tightly to underlying perturbations in particular behaviors and brain circuits that give rise to them, we’re better able to tailor treatment to the individual patient,” Pine said.

Tracking the digitized brain

In the last presentation of the afternoon, Vaibhav Narayan, of Janssen Research & Development, described developments in mobile technology. While personal electronic devices that sense and track behavior and physiology are useful in other medical areas, “they will have a disproportionate impact in neuroscience,” he said.

“The brain is a highly complex organ with complex circuitry, but its entire biology expresses itself in a phenotypic layer of behavior, symptoms, and memory that is ripe for being digitized.” Mobile devices will untether indicators of disease onset and exacerbation from controlled settings, enabling more continual measurement, even at home, he said.

Narayan offered, as a simple example, a study tracking home computer use. Among people diagnosed with mild cognitive impairment, hours at the computer declined steeply month by month; among the cognitively stable, they stayed relatively constant.

“Something as simple as measuring time spent on a computer gives an idea of what’s going on in the brain. You can imagine a lot more sensitive measures, like interkey speed as the user types in a password.”
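One way such usage logs could be turned into a simple digital marker is to fit a per-month trend in hours of use and watch the slope. The data below are invented for illustration; a steadily negative slope is the kind of decline the study described in people with mild cognitive impairment.

```python
# Minimal sketch: reduce monthly computer-use hours to a trend (slope).
# All data are hypothetical illustrations.

def monthly_trend(hours_per_month: list[float]) -> float:
    """Least-squares slope of hours vs. month index (change in hours per month)."""
    n = len(hours_per_month)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(hours_per_month) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, hours_per_month))
    den = sum((x - x_mean) ** 2 for x in xs)
    return num / den

stable = [42, 40, 43, 41, 42, 40]        # hypothetical cognitively stable user
declining = [45, 41, 36, 30, 25, 19]     # hypothetical user with mild cognitive impairment

print(f"stable slope:    {monthly_trend(stable):+.1f} hours/month")
print(f"declining slope: {monthly_trend(declining):+.1f} hours/month")
```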

But can technology detect neurodegenerative brain changes before noticeable cognitive decline? “The only real way to answer the question is through data,” Narayan said. Brain pathology is commonly reflected in a declining ability to take medications, use the phone, and handle everyday finances. “You can use data-driven technology to measure these things,” he said.

Speech recognition technology in a smartphone can be tailored to measure verbal episodic memory, the domain that correlates most strongly with dementia. Embedding assessments of spatial episodic memory in map and GPS apps of people who agree to be tracked can generate massive data sets that may lead to validated tests to distinguish signs of pathology from normal declines in function.

In psychiatric disorders, change is considerably more dynamic. Bipolar symptoms, for example, may be stable over time, then worsen precipitously in a matter of days. This, Narayan said, suggests the possibility of prodromal intervention to preempt the onset of disease. In addition, “every relapse is another opportunity to detect, intervene, and improve the outcome,” he said. By tracking such parameters as physical activity, social interaction, and sleep, a smartphone app could pick up perturbations suggesting an imminent episode.

One ongoing study is following more than 300 people with major depression for a year, continuously collecting data on sleep, activity, and speech, in search of “a technology-driven signature of relapse that we can detect,” he said.

Narayan pointed out that dynamic data from devices, while sensitive to signs of illness, are not specific, and must be combined with clinical data for a meaningful biosignature. “It’s not just change that’s predictive, but the rate of change,” he said. “You have to collect longitudinal data and be your own control.”
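A minimal sketch of that “be your own control” idea, under assumptions not stated in the article: compare each new measurement, and its day-to-day rate of change, against the person's own rolling baseline, and flag large deviations. The data, window size, and thresholds below are hypothetical.

```python
# Minimal sketch of within-person change detection on a tracked signal.
# Window lengths, cutoffs, and the sleep data are hypothetical.
from statistics import mean, stdev

def flag_perturbation(series: list[float], baseline_n: int = 14, z_cut: float = 2.0) -> bool:
    """Flag if the latest value, or its day-to-day change, deviates from the personal baseline."""
    baseline = series[:baseline_n]
    deltas = [b - a for a, b in zip(baseline, baseline[1:])]

    latest = series[-1]
    latest_delta = series[-1] - series[-2]

    z_level = abs(latest - mean(baseline)) / (stdev(baseline) or 1.0)
    z_rate = abs(latest_delta - mean(deltas)) / (stdev(deltas) or 1.0)
    return z_level > z_cut or z_rate > z_cut

# Hypothetical nightly sleep hours: a stable baseline, then a sharp drop
sleep_hours = [7.2, 7.0, 7.4, 6.9, 7.1, 7.3, 7.0, 7.2, 6.8, 7.1,
               7.0, 7.3, 7.1, 6.9, 7.0, 6.8, 5.9, 4.6]

print("possible imminent episode?", flag_perturbation(sleep_hours))
```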

Along with the rapid proliferation of mobile devices have come quality control issues, he said. “The technology is moving a lot faster than the science… people are developing literally thousands of apps that claim to do things like monitor mood and offer therapeutic intervention, but the number that have any data behind them is small.”

“The most important thing we need to do is tie applications of digital technology to data, to evidence,” Narayan said. “We have to develop this ecosystem in a thoughtful manner.”