Altering Conscience?


by Carl Sherman

June 18, 2010

Summary: By studying people with brain damage—and healthy people whose moral judgments changed as they were exposed to a magnetic field—researchers are trying to trace out the neural basis for discriminating right from wrong.

Joe and Jack spot some mushrooms while hiking in the woods. Jack asks Joe, an experienced forager, if they’re safe to eat. Joe recognizes a highly poisonous variety that causes agonizing death in minutes. “Sure,” he says, “they look tasty.”

Jack grabs a handful. “Yum!” he exclaims, and soon they’re back on the trail. It turns out Joe was wrong about the mushrooms.

Did Joe do a bad thing?

Most people would think so—bad intent makes bad behavior, regardless of outcome. The law agrees, says Walter Sinnott-Armstrong, professor of ethics at Duke University and co-director of the MacArthur Foundation’s Law and Neuroscience Project. “We even put people in prison for attempting to cause harm, for acts that didn’t actually hurt anyone,” he says.

But people with damage to a brain area called the ventromedial prefrontal cortex (VMPC) see things differently. “No harm, no foul,” seems to be their moral philosophy. Working with people with such lesions and with experimentally induced alterations in brain function, neuroscientists are gaining insight into the neural basis of moral judgment—and raising provocative questions about human nature itself. 

In a study reported in the March issue of Neuron, researchers led by neuroscientist Liane Young of the Massachusetts Institute of Technology tested nine people who had sustained bilateral damage to the VMPC against two control groups, one neurologically healthy and one with damage to other brain areas, comparing their reactions to a series of scenarios that mixed intentions and outcomes.

In another version of the mushroom story, for example, Joe believes the mushrooms are harmless—but they’re not, and Jack dies. In other vignettes, intention and outcome match.

While intentions mattered most to the controls (they judged a failed attempt to cause harm more blameworthy than a deadly accident), for VMPC patients it was the reverse: Only the outcome counted, regardless of what the protagonist believed or intended.

  “The VMPC is involved in processing social emotions like empathy, embarrassment, and guilt,” Young says. “If you show emotionally evocative pictures like war scenes or mutilated bodies to patients with damage to this region, they won’t show the normal response.”

People with VMPC damage have no deficit in reasoning, Young says. “What they can’t do is emotionally integrate information about intention when making a moral judgment.”

Her findings are somewhat counterintuitive, she says. “One usually highlights the role of reflective, rational faculties in morality. This study shows there’s an important role for emotions.”     

But the VMPC is only one brain area that participates in moral calculations. In another study, reported in the February Proceedings of the National Academy of Sciences, Young and colleagues turned their attention to the right temporoparietal junction (RTPJ). “It used to be thought that this region was broadly involved in desires and attitudes, but as techniques got better, we narrowed its function down to processing what are called representational mental states,” she says.

Brain imaging studies show the RTPJ is engaged when we read about or imagine what’s going on in other people’s minds. To determine whether we need it to judge their actions, Young and her colleagues used transcranial magnetic stimulation (TMS)—a technique that uses a magnetic field to induce electrical currents within the brain. TMS (which is also applied therapeutically for depression) can be targeted to briefly disrupt brain activity in precisely defined areas.

The researchers had healthy volunteers pass judgment on a series of vignettes like those in the VMPC study, both normally and while their RTPJ was subjected to magnetic interference. As the scientists had predicted, impairing RTPJ function skewed moral judgment: The protagonist’s intention didn’t cease to matter (attempted mischief was still judged worse than an honest, if serious, mistake), but it counted about 15 percent less than it normally did.

In one phase of the study, the researchers timed the TMS pulse to the moment when the full scenario had been presented and the subject was asked to pass judgment. “This suggests to us that it’s the integration of belief into the judgment that’s disrupted,” Young says. “Even though they’ve read the story and understand it, they can’t process this information.”

“Liane Young is a leader in the study of the role of intentions and beliefs in moral judgment,” says Sinnott-Armstrong. Earlier research connected these brain regions to such functions, “but these papers get beyond correlation to causation,” he says. “They represent a big step forward.”

The use of TMS was particularly novel and striking, he says. “I think people were shocked you could change people’s moral judgment with [a magnetic field].”

Young thinks that the RTPJ and VMPC are part of a broader circuit activated when we evaluate others’ actions. In another study, she and Sinnott-Armstrong were part of a research team led by Michael Miller of the University of California, Santa Barbara, that sketched in more of the network.

The experiment, reported in the June Neuropsychologia, involved six people who had undergone full or partial callosotomy—surgery that severs the connections between the brain’s hemispheres—as treatment for intractable epilepsy. The participants were asked to assess the same vignettes as in the earlier studies, and they responded much as the VMPC patients had, failing to appreciate the importance of intention the way the controls did.

The deficit, the researchers say, shows that moral judgment requires cross-hemispheric integration, linking the RTPJ with the left frontal cortex, an area active in judgment. “But there’s a mystery here: why the partial split shows the same effect as the full split,” says Sinnott-Armstrong. It may be that waves of activity must travel elsewhere through the right hemisphere before crossing over; he suggests that imaging techniques such as diffusion tensor imaging might help delineate these pathways.

Studies like these could have implications for the legal system, says Sinnott-Armstrong. Understanding how we judge others’ actions “could help make guilt determination and sentencing decisions more reliable.” In particular, spelling out the neural basis for discriminating right and wrong “might help us better determine when people lack the capacity necessary for moral responsibility,” he says.

But whether such research will reveal a master neural conscience is doubtful, he says. “I believe different types of moral judgments activate different parts of the brain.”

Joshua Greene, a psychologist and neuroscientist at Harvard University, agrees. “We sometimes think of a moral sense as if it’s a unified faculty in the mind, that there’s a part of the brain specifically dedicated to moral judgment, and I think the evidence is strongly against that. Moral judgment depends on many different systems that sometimes compete against each other. Some play the same role in contexts that have nothing to do with morality,” he says.

“Neuroscience studies are powerful because of what they teach us about psychology,” says Greene. “As a philosopher, I find most interesting their broader implications, how they might change the way we think about ourselves.”