Our experiences in the world shape our brains. This is abundantly clear in the study of the deaf brain: Loss of an important sensory input leads to key changes in both the structure and connectivity of the cortex. Yet those observed changes, primarily in the temporal lobe areas involved with auditory processing and speech, may not apply to all non-hearing persons. Research from the Georgetown University Medical Center illustrates how the language that a deaf child learns first, English or American Sign Language (ASL), is linked to structural differences in the brain's language areas.
The deaf brain
According to the National Institute on Deafness and Other Communication Disorders (NIDCD), approximately 2-3 of every 1,000 children born in the United States are deaf or hard-of-hearing. The deprivation of sound and language results in key changes to the auditory cortex, in particular the brain systems involved with sound and speech processing. Those structures in the auditory cortex are markedly smaller in volume in non-hearing populations, and smaller volumes have been observed in the brain's temporal and frontal speech centers, too.
"There are quite a few differences in the brains of people who are deaf," says Nina Kraus, director of the Auditory Neuroscience Laboratory at Northwestern University. "They are often starting out with a different brain right away. If we can understand these differences, the differences that were there initially or the differences that came based on experience, it may give us the right information to effect strong educational and learning strategies in the future."
Guinevere Eden, director of the Center for the Study of Learning at Georgetown University Medical Center, wondered how those brain changes might change the way deaf people learn to read. But as she perused the scientific literature on the topic, she noticed that most studies concentrated on deaf people who had learned ASL as their first language. Yet, as the NIDCD notes, 90 percent of deaf babies are born to hearing parents, and likely learn English through lip reading before they learn ASL. So Eden wondered if an important group of deaf learners might have been overlooked.
"The nature of ASL, because of its special components, can influence brain function quite profoundly. So there's good reason to believe it should also impact brain anatomy," she says. "But most people who are deaf are born to hearing people. They may not have access to ASL at an early age. So we began to realize that there were really two groups of deaf learners here and only one had been studied, and by looking at both, we had the opportunity to get a better understanding of the role different language learning experiences may play in brain structure." (Eden received a grant from the Dana Foundation in 1996 to do research on dyslexia.)
Different language, different brains
Eden, with post-doctoral fellow Olumide Olulade, compared the brain structures of deaf people with those of hearing controls. Half of the deaf participants had learned English as their first language, the other half ASL. Because language and auditory processing areas lie quite close together in the temporal lobe, the researchers took care to look at each separately during the analysis so they might better understand which differences may be due to hearing loss and which might be linked to the first language learned.
As in previous studies of deaf participants, Eden and colleagues found that all the deaf participants had a smaller volume of white matter in the auditory cortex than hearing participants. However, the group found key differences in language areas, and those differences were specific to which language was learned first. Those who learned ASL first showed greater anatomical differences from hearing controls, with smaller white matter volume in the left superior temporal region and the inferior frontal region. The results were published in the April 16 Journal of Neuroscience.
"What this study is showing is that things that apply to one deaf student or one deaf person may not apply to another," says Olulade. "The differences we see in experience, not just in their sensory experience but also in their language learning experience and many other experiences on top of that, these all play a role in shaping brain structure and function. And that, in turn, shapes learning."
Experience and learning
Kraus applauds this study as a first look at the question, and while she says the results did not surprise her, she thinks they are an important addition to our understanding of language and learning.
"These two languages do have some things in common. Both are visual; one has to read the signs or the lips to understand the language. But they are quite different. ASL is a spatio-motor kind of language, and English has some motor learning involved but is much more about lip reading and phonemes. ASL has its own grammar," she says. "And this study shows key differences in the areas that underlie linguistic skills and phonological processing. And that makes sense, because these two languages are quite different. So seeing these distinctions, emphasizing a particular language's linguistic structure, dovetails nicely with what we know about language, learning, and the brain."
Both Kraus and Eden suggest that we look at these kinds of findings more broadly, as they have the power to help us better understand how we learn.
"If we can understand the biological circuits involved with learning, if we can see which aspects are enhanced, diminished, or even just changed when we learn a language, we gain more insight into how that learning happens in the first place," says Kraus. "And that's important information to have, whether someone is hearing impaired or not."