Out of Left Field

New Insights into Where Language Understanding Resides in the Brain
Author: Kayt Sukel
Published: June 28, 2019

More than 150 years ago, Paul Broca, a French physician, turned common wisdom regarding brain function on its head when he presented a unique patient case to the Société d’Anthropologie de Paris, a leading scientific society. At the time, many believed that the brain produced thought, emotion, and action in a holistic manner; that is, the entire organ was required for any and all of these functions. But after Broca described the autopsy results of an epileptic patient named Monsieur Leborgne, who, while still living, was only able to produce the word “tan,” he argued that speech function was localized to an area in the left frontal lobe. First-year psychology students now know this small region as “Broca’s area,” the cortex’s speech center, and that patient by the only word he could utter, “Tan.” In the decades that followed, other patient cases and neuroscientific investigations suggested that language, from speech production to grammatical understanding, largely resided in the left hemisphere of the brain. But now, a new study from New York University School of Medicine suggests that language processing may be less lateralized than once thought.

To the left, to the left

Clinical observations after brain damage, from Broca’s time and in the years since, consistently show that damage to the left side of the brain can lead to language deficits like aphasia, the loss of the ability to understand or produce speech. That is why most people associate language with the left hemisphere of the brain. But Anthony Dick, Ph.D., director of the Developmental Cognitive Neuroscience Laboratory at Florida International University, says that understanding is an oversimplification.

“Certainly, language is specialized to the left side of the brain, but newer work has shown that it also incorporates areas in the right hemisphere,” he said. “But we continue to talk about it as if it’s a left-hemisphere thing because of stories like Broca and Tan. The truth is that the brain is hard to understand even for people who study it for a living. But we need to think beyond these simple stories if we really want to understand how the brain does what it does.”

That’s why Adeen Flinker, Ph.D., an assistant professor in the Department of Neurology at New York University School of Medicine, wanted to study hemispheric asymmetry in language to understand how the brain processes, and makes sense of, speech. He said that while the brain is largely symmetrical, and modern studies support the idea that both hemispheres help us read, speak, and write language, it’s not well understood what each side may be contributing to these complex processes.

“This is still a fundamental question in human neuroscience that has been unanswered for hundreds of years,” he said. “We wanted to explore the division of labor between left and right hemispheres in language processing.”

In one ear and out the other

Flinker and colleagues, including neurosurgeon Ashesh Mehta, M.D., from the Donald and Barbara Zucker School of Medicine at Hofstra, used a dichotic listening task, in which recordings of speech were presented to one ear or the other; sound heard in the left ear is initially processed by the right hemisphere, while sound heard in the right ear is initially processed by the left. They manipulated the speech, changing both its temporal, or timing, features, like rhythm, and its spectral, or frequency, features, like pitch and melody, to see how well it was perceived on either side.
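For readers curious what such a stimulus looks like in practice, here is a minimal sketch in Python of how a dichotic presentation and a crude temporal manipulation might be put together. This is an illustration only, not the team’s actual stimulus pipeline; the file names, the 8 Hz envelope cutoff, and the filtering choices are all assumptions.

```python
# A rough sketch (not the study's pipeline) of building a dichotic stimulus:
# the speech clip is routed to one ear while the other ear gets silence.
import numpy as np
from scipy.io import wavfile
from scipy.signal import butter, filtfilt, hilbert

rate, speech = wavfile.read("speech_clip.wav")   # hypothetical mono clip
speech = speech.astype(np.float64)
speech /= np.max(np.abs(speech))                 # normalize to [-1, 1]

# Dichotic routing: column 0 = left ear (right hemisphere first),
# column 1 = right ear (left hemisphere first).
to_left_ear = np.column_stack([speech, np.zeros_like(speech)])
to_right_ear = np.column_stack([np.zeros_like(speech), speech])

# Crude "temporal" degradation: replace the fast amplitude envelope with a
# smoothed one, blurring timing cues while roughly preserving the spectrum.
envelope = np.abs(hilbert(speech))
b, a = butter(4, 8 / (rate / 2), btype="low")    # keep envelope cues below ~8 Hz
slow_envelope = filtfilt(b, a, envelope)
temporally_blurred = speech * (slow_envelope / (envelope + 1e-9))

wavfile.write("dichotic_right_ear.wav", rate,
              (to_right_ear * 32767).astype(np.int16))
```

A spectral manipulation would instead alter pitch and frequency content while leaving timing intact; the published study used more sophisticated spectrotemporal filtering than this toy envelope smoothing.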

“Language isn’t just the words that you are hearing. It’s also the way that someone is saying those words,” said Mehta. “There’s that old saying, ‘It’s not what you said but how you said it,’ and the temporal and spectral aspects are important to that how. You know that someone may be angry, or asking you a question, by the timing or the pitch they use when speaking to you.”

The researchers tested people with normal hearing on the dichotic listening task while measuring their brain activity using magnetoencephalography (MEG). In addition, they used electrocorticography (ECoG), which records directly from the surface of the brain, to measure brain activity in patients with epilepsy who were undergoing neurosurgery. They found that the two hemispheres play distinct yet overlapping roles in understanding speech: the right ear, or left hemisphere, showed an advantage in tracking how sounds change over time, while the left ear, or right hemisphere, was better at detecting changes in pitch and frequency. The results were published March 4, 2019, in Nature Human Behaviour.

Flinker said that, beyond these hemispheric differences, their results also showed a much wider network of activation than originally expected.

“I predicted that the correlations that we saw between behavior and neural signal would be strong in auditory cortex, but we saw them all over the brain,” he said. “This language network is probably modulated by the features of the sound, like how I’m speaking and the words I’m saying, but also by you trying to pay attention to what I’m saying and make sense of what I’m saying. There’s a really large system that we saw engaged.”

Moving forward

Flinker hopes to continue this work in order to better understand what other language and speech features might drive hemispheric asymmetries in the brain. He thinks these subtle differences can help us better understand language networks and how they may be compromised by brain damage, and, in turn, lead to better interventions for people who develop aphasia after a stroke or other injury.

For his part, Dick hopes that newer studies can help neuroscientists tell new stories about language and the brain: ones that aren’t so stuck in the past (or the left hemisphere).

“The reason so many of the old models of language work so well is that you can talk about a story like Broca in an introductory psychology class and explain it in a way that people can understand,” he said. “What we need, beyond more studies looking at how the brain makes sense of language, is a way to come up with stories that can capture the complexity of new findings but still make sense to people who aren’t neurobiologists.”