Brain Responds Quickly to Faces


by Kayt Sukel

November 2008

As the 200th anniversary of Charles Darwin’s birth approaches (in February 2009), the bulk of studies of facial expressions supports his premise that expressions of emotions such as happiness, sadness and anger are universal across races and cultures. With new methods in cognitive psychology and neuroscience, researchers are extending our understanding of just what facial expressions convey and how we interpret them.

Using Faces to Judge Intention

Studies over many years have revealed that human beings make rapid-fire judgments about others based on the look on their faces.

“We know from carefully controlled laboratory studies that we form opinions very rapidly and dramatically based on appearance,” says Alexander Todorov, an assistant professor of psychology and public affairs at Princeton University. “In only about 100 milliseconds, a person will determine whether they like or don’t like another person based on their face.”

In a study published in the August 12 issue of the Proceedings of the National Academy of Sciences, Todorov and Nikolaas Oosterhof set out to uncover which parts of the face explain how we judge who should be trusted or feared. They determined that two traits of faces with neutral expressions seem to be critical: valence, or whether the face looks positive or negative; and dominance, or whether the person looks weak or strong. They then used a computer model to figure out exactly what makes a face appear positive or negative, weak or strong, and discovered subtle emotional cues lurking in even neutral faces.

“We found that when we started to exaggerate the features of a neutral face that people think is trustworthy, we ended up with a happy or smiley face. With a negative face, we ended up with an angry expression,” Todorov says. “It’s interesting because it’s these emotions that signal to you whether you should approach or avoid a person.” Todorov and Oosterhof plan to use the data from this study to construct new models of how specific facial structures are linked to emotional judgments.
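To make that approach concrete, here is a minimal sketch in Python of how a linear trait model of this general kind can be fit and then used to exaggerate a face. Everything in it is hypothetical: it assumes faces have already been encoded as numeric feature vectors (the study used a far richer statistical face model), and the faces and ratings below are randomly generated stand-ins.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical stand-ins: 300 faces encoded as 50-dimensional feature
    # vectors, each paired with a trustworthiness rating from human judges.
    n_faces, n_features = 300, 50
    faces = rng.normal(size=(n_faces, n_features))
    trust_ratings = rng.normal(size=n_faces)

    # Fit a linear "trustworthiness" direction in face space by least squares.
    direction, *_ = np.linalg.lstsq(faces, trust_ratings, rcond=None)
    direction /= np.linalg.norm(direction)

    def exaggerate(face, trait_direction, amount):
        """Move a face along a trait direction. In the study, pushing a
        neutral face toward 'trustworthy' yielded a happier-looking face,
        and pushing it the other way yielded an angrier-looking one."""
        return face + amount * trait_direction

    neutral = faces[0]
    more_trustworthy = exaggerate(neutral, direction, +3.0)
    less_trustworthy = exaggerate(neutral, direction, -3.0)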

The research builds on earlier work in which Todorov and colleagues examined whether these immediate judgments had any influence in a real-world scenario such as an election. The group collected photos of the winners and runners-up from Senate, House and gubernatorial races across the country and simply asked study participants to select which of the two looked more competent. As reported in the June 10, 2005, issue of Science, the candidate whose face was deemed more competent in that split-second decision was the one who carried the election about 70 percent of the time.
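The logic of that analysis is simple enough to sketch in a few lines of Python. The race data here are invented purely for illustration; the actual study used photographs and real U.S. election results.

    # For each two-candidate race, record (candidate judged more
    # competent, actual winner). These pairs are made up.
    races = [
        ("A", "A"), ("B", "A"), ("A", "A"), ("A", "A"), ("B", "B"),
        ("A", "B"), ("B", "B"), ("A", "A"), ("B", "B"), ("A", "A"),
    ]

    hits = sum(judged == winner for judged, winner in races)
    print(f"More-competent-looking candidate won {hits}/{len(races)} races "
          f"({hits / len(races):.0%})")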

The Importance of Context

The quick judgments humans make about faces can be revised with additional information, researchers have found.

“We’ve seen that both personality as well as external context can strongly influence social perceptions,” says Pascal Vrticka (pronounced VER-titch-kuh), a Ph.D. student at the University of Geneva. To determine how, Vrticka and colleagues used functional magnetic resonance imaging (fMRI) to measure blood flow in the brain during a motivating game task. After each round of the game, participants were shown a word and a face; they were led to believe that the faces belonged to individuals who might benefit from their performance. The word described their performance, and the face was either happy or angry, yielding two congruent (win/happy, lose/angry) and two incongruent (win/angry, lose/happy) conditions. After the game, participants completed a personality questionnaire that measured their attachment style, or how they relate to others in social relationships.
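The four conditions form a simple two-by-two design, game outcome crossed with facial expression, which can be laid out in a short Python sketch (purely illustrative; the labels follow the description above):

    from itertools import product

    # Game outcome crossed with facial expression yields four conditions;
    # win/happy and lose/angry are congruent, the other two incongruent.
    outcomes = ["win", "lose"]
    expressions = ["happy", "angry"]
    congruent_pairs = {("win", "happy"), ("lose", "angry")}

    for outcome, expression in product(outcomes, expressions):
        label = "congruent" if (outcome, expression) in congruent_pairs else "incongruent"
        print(f"{outcome:>4} / {expression:<5} -> {label}")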

In the August 6 issue of Public Library of Science ONE, Vrticka’s group reported that even visually identical facial expressions result in different patterns of brain activation, depending on the social context. For the combination of a happy face and a winning score, the researchers found that the ventral striatum and ventral tegmental area, regions associated with reward processing, were activated. But in the negative conditions, the group found increased activation in the left amygdala. Furthermore, the activation varied with a person’s attachment style: avoidant personality types showed less reward-area activation in the positive condition, and anxiously attached people showed more amygdala activation in the negative condition.

“We found that participants with an avoidant personality type were much less sensitive to social rewards in our game,” says Vrticka. “For them, personal success was more important than praise from others. On the other hand, because anxiously attached people generally worry about being rejected by others, they are thought to be more sensitive to negative social cues. And this notion of increased vigilance to signs of social punishment corresponds to the activation in the amygdala that we found.”

Turning Off the Ability to Read Expressions

Historically, faces have been considered a special category of objects. Several neuroimaging studies have pegged the fusiform gyrus, in the temporal lobe, and the occipital gyrus as areas of special interest for face recognition. But to date it has been difficult to tease out the significance of activation in these areas.

David Pitcher, a Ph.D. student at University College London, used repetitive transcranial magnetic stimulation (rTMS), a technique that delivers magnetic pulses to briefly disrupt neural processing, over the right occipital face area (rOFA) and over a face-sensitive region of the right somatosensory cortex, to see whether stimulation would interfere with face recognition and with facial expression discrimination.

The group reported in the Sept. 3 Journal of Neuroscience that rTMS impaired participants’ ability to determine whether two faces had the same expression. It had no effect, however, on their ability to distinguish whether two faces belonged to the same person. Pitcher argues that because rTMS to the somatosensory cortex interfered with the discrimination of facial expressions, nonvisual cortical areas must also be involved in processing them.

“When you look at when the rTMS was delivered, it’s almost tracking that information going from the visual to the non-visual areas of the brain,” he says. “And it’s all happening very early. The somatosensory area of the brain is reacting milliseconds after you see the stimuli.”

Updating a Representation

Although scientists are learning more about how humans interpret facial expressions, much of that work is limited to the milliseconds it takes to make that initial judgment.

“We have a fairly good understanding of the basic evaluation process,” says Todorov. “But these early judgments are easily overwritten when new information comes in.”

Todorov says there is still much to understand. His future work includes studying how the brain updates these early judgments – and what impact those updates may have on interactions with the rest of the world – as well as the neural circuitry underlying the changing representation.

“Facial expressions are very important. They are essential for social communication,” he says. “But we still have a long way to go to figure it all out.”