Brain Actively Alters What We See


by Aalok Mehta

May 4, 2009

We don’t just see the world through colored glasses; we tweak the tint of those spectacles to suit our expectations, a recent study of the visual system suggests.

For years, neuroscientists have wondered about the differences among people when they make decisions about seemingly simple visual data, such as whether an object is close or far away. One possible explanation is simple random variation: the brain absorbs ambiguous data and sifts through it for what’s important, what researchers call a “bottom-up” or causal explanation. Or the differing decisions may reflect some active, “top-down” intervention from the brain, based on what it expects to see or on the need to reinforce a decision it has already made.

By tracking neural activity in macaque monkeys, researchers have found evidence for the latter. The brain doesn’t simply assemble sensory data into a coherent image; once it makes a decision, it seems to send signals back down the line that alter the sensitivity and responses of neurons handling early visual data, literally changing how we see.

“What we find is that a causal explanation is insufficient,” says study leader Hendrikje Nienborg, a postdoctoral fellow at the National Eye Institute in Bethesda, Md. “A significant component of decision making reflects a top-down effect on sensory neurons.”

The distinction may seem subtle, but it adds empirical data to an age-old debate about how large a gap exists between what we perceive and what is actually there, and about how the brain might shape that gap.

And it suggests that top-down mechanisms play a deeper part in brain processes than previously thought, occurring even in basic visual processing. Attentional control—in which we actively decide to focus on a specific item—is another such case, but it occurs at a much higher, more abstract level of thinking.

“It’s generally been thought that the earlier you are in [a neural] pathway, the less the influence of a top-down effect,” says Joshua Gold, an assistant professor of neuroscience at the University of Pennsylvania who was not involved in the research. In other words, such feedback from the brain wasn’t expected during the initial stages of sensory processing but much later on. “It’s now not clear just how true that simplistic model is,” Gold adds.

Measuring single neurons

The experiment, published March 8 in the journal Nature, examined how the monkeys judged the distance of an object that alternated between appearing close to the animals and farther away. Simultaneously, Nienborg and her colleague Bruce Cumming tracked the activity of individual neurons in area V2, one of the first regions in the brain to receive visual data.

By comparing when the monkeys made their decisions with how strong the evidence for each choice was, the researchers found several threads of evidence weighing against the causal model. They also identified a physiological mechanism for the top-down effect: the gain of V2 neurons (essentially, how large a signal the cells passed on) varied in step with the monkeys’ choices.
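To make the idea of “gain” concrete, here is a minimal sketch, not taken from the study, that treats a neuron’s response as a baseline tuning curve multiplied by a choice-dependent gain factor, so the very same stimulus can evoke a larger or smaller signal depending on the decision. Every name and number below is invented for illustration.

```python
import numpy as np

def v2_response(disparity, gain=1.0):
    """Illustrative firing rate (spikes/s) of a hypothetical V2 neuron.

    The neuron prefers "near" disparities; its response is a Gaussian
    tuning curve scaled by a multiplicative, choice-dependent gain.
    """
    baseline = 5.0  # spontaneous firing, spikes/s (assumed)
    tuned = 40.0 * np.exp(-((disparity - (-0.2)) ** 2) / (2 * 0.1 ** 2))
    return baseline + gain * tuned

stimulus = -0.2  # a "near" disparity, in degrees (assumed value)

# Identical stimulus, but the signal passed on differs with the choice:
print(v2_response(stimulus, gain=1.0))  # trial where the animal chose "far"
print(v2_response(stimulus, gain=1.2))  # trial where the animal chose "near"
```

In this toy picture the feedback doesn’t change what the neuron is tuned to, only how loudly it reports it, which is one way to read the choice-linked gain changes described above.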

The researchers aren’t sure why this change in neuronal gain occurs. The brain may be trying to boost information from cells to support a decision it has already made. Or the brain may have formed a bias about what kind of object is important to notice—a nearby one, for instance, because of the possible danger—and attempts to increase signals tuned to that information.

Regardless of the reason, one potential application of the finding is in hearing and vision aids, says Nienborg. “What people are trying to do with prosthetic devices is to provide raw information in one direction,” she says. “But if that information is being modified, that’s something to think about when designing these devices.”

She suggests that this effect may also be one explanation for why our brains tend to contain clusters of neural cells that respond to similar visual signals. “It makes sense if you want to send top-down information to the neurons,” Nienborg says. “You don’t have to intersperse connections among many different areas, just send to one cluster. … It could represent an organizing principle for wiring efficiency.”

That conclusion about the brain’s organization makes sense but is highly speculative, says Michael Shadlen, a professor of physiology and biophysics at the University of Washington School of Medicine in Seattle who also studies decision making and vision. “I can think of many other reasons why this may be the case,” he says. “Rats don’t have the same organization, and yet they can see.”

But otherwise, he says, the paper makes a convincing case that top-down mechanisms are important in vision and sets the stage for future investigations into just how complex the neuroscience behind our senses really is.