As we interact with our environment in the course of our daily lives, most of us take for granted the smoothness with which the flow of events blends together. Our brains stitch together what might otherwise seem to be a disjointed sequence of individual events into a coherent stream of information about the world around us. Language is an obvious example, of course, as letters are bound together into words, words are grouped into phrases and sentences, and sentences form paragraphs and discourse. But music, too, relies on our ability to bind together smaller elements into larger structures. Individual notes are strung together to form melodic motifs. Melodic motifs are used to shape phrases, which in turn form songs or even movements of symphonies. Notes sung simultaneously by different voices or played by different instruments are combined to create harmonies. How our brains bind together these discrete pieces of auditory information to create the experiences of hearing, remembering, or performing music is at the heart of a recent surge of neuroscience research aimed at understanding this ubiquitous human phenomenon.
Despite music’s central role in human cultures around the world, and its potential to help unlock the mechanistic secrets of the brain, its arrival on the scientific scene is rather recent. Nonetheless, exploring music’s basis in the brain can help shed light on a remarkable human activity that has been a part of our social and cultural fabric for millennia. Moreover, while the relegation of music to scientific second fiddle is understandable, we should not minimize the role that music can play in our broader understanding of how the brain works.
Translating a Musical Score into Action
Understanding how the brain accomplishes music is likely to enhance our understanding of the brain’s inner workings for the simple reason that musical behaviors include the same elements of perception, action, emotion, and other mental operations as so many other kinds of behavior. To get a handle on how music might be processed by the brain, scientists use two primary methods: case studies of people who have suffered some form of brain damage and neuroimaging studies to measure physiological changes in the brains of healthy people while they perform various tasks related to music, such as remembering short melodies or reproducing rhythms. Studying people with an injury to the brain provides a particularly fascinating window into the workings of the brain because it helps researchers understand general principles of brain organization while taking into account variations in individual cases.
Following damage to the brain, such as the death of tissue after a stroke, a person is likely to experience deficits—whether in perception, movement, attention, or memory—that make performing previously routine behaviors and tasks more difficult. If neurologists carefully identify these deficits, they can pinpoint the specific mental operations that are impaired. When multiple patients who have damage to the same brain area experience the same deficits, the neurologist can ascribe the underlying mental operations to specific brain areas. For example, damage to the left side of the frontal lobe of the brain in a region known as Broca’s area results in Broca’s aphasia, the inability to produce sequences of speech sounds and words.
Closer scrutiny of individual patients presents a more intriguing picture, however. Often some highly specific functions are lost but others are spared, despite the brain damage being widespread or occurring in a region commonly associated with a multitude of functions. My launching point for exploring music in the brain in this article is the remarkable personal account of one such patient, Ian McDonald, M.D.1
McDonald was a British neurologist with an avocation as a classical pianist. He was a skilled musician who spent much time playing the piano, both alone and in small chamber music groups. But as a result of a stroke in 2004, he lost his ability to read and play music from a score, as well as to appreciate it on an emotional level. Fortunately, he documented both his symptoms and his recovery. His report, taken together with other similar case studies, provides insight into the role that the brain’s parietal lobes play in transforming information from one form into another and in binding together the stream of events into a continuum of meaningful experience.
Initially, McDonald did not realize he had experienced a stroke, but after becoming aware of various difficulties with previously routine tasks, he sought a medical evaluation. The resulting exam and tests showed that the stroke affected a relatively circumscribed area of his brain, two folds in the surface of his right hemisphere that are called the right supramarginal and angular gyri. These folds are part of the parietal cortex, as shown in the figure below.
On the mechanistic level, McDonald reported considerable difficulty in reading a musical score. He suffered from “musical alexia”—an inability to read music—that was so pronounced he had trouble reading even simple melodies. He made clef errors, meaning that he interpreted music written in the bass (bottom) clef—generally played by the left hand—as being written in the treble (top) clef, generally played by the right hand. He also had a hard time reading the notes written above or below the staff, the set of five lines on which notes are marked. Fluent reading of a musical score requires more than reading individual notes on a page; it also requires translating the notes into actions. So it is interesting that McDonald also had problems moving his hands to the appropriate places on the keyboard, especially when larger jumps across many piano keys were involved.
Remarkably, even though he was unable to read the score fluently, he was able to give letter names to the notes in the score (A, B, C, and so on) and then use that information to pick out single notes of a melody on the keyboard, one at a time. Thus, it wasn’t that the score made absolutely no sense to him; rather it was that he couldn’t read it in the way to which he was accustomed, by translating the notes directly into action without having to think of their names. What might a neuroscientist see in McDonald’s ability to translate the information in a musical score into appropriate key presses on the piano using the more circuitous and tedious process of verbal labeling?
To understand the implications, we must first understand what cognitive neuroscientists mean by the term “representation.” This word simply reflects the idea that things existing in the external world also exist, in a separate form, in our minds. In the case of reading music, physical information about a written musical note (a black oval with a vertical line extending from the side of it) can be present in the brain at multiple levels of abstraction. These mental representations are embodied in patterns of activity among groups of neurons in different parts of the brain. For example, neurons at early stages of processing in the visual cortex will respond to the physical properties of the note—the size of the black oval and the line—thus representing those physical aspects. But a higher stage of processing is required to interpret (and therefore represent) the particular combination of physical features that make up the note.
So McDonald’s ability after his stroke to translate notes on a page via a different method suggests that information about a musical score can be represented in different ways in multiple brain regions. It also indicates that these representations can be combined flexibly to meet different goals, for example, naming the notes on a page versus playing them with a desired phrasing.
If a musician wants to spell out the notes that form a chord or a melody, for example “C-E-G,” the visual image of each note on the page must be bound together with the verbal label that corresponds to that note. Associating a note on a specific line, or space between two lines, with a letter name isn’t enough, because the same line in the treble (top) or bass (bottom) clef corresponds to a different note. The problem is further compounded by the key signature—the sharps and flats—which typically appears only at the beginning of the staff, next to the clef sign. All of these pieces of information together provide the context that shapes how the brain translates a circle appearing on a line—a note—into a letter label. This context must be held in the musician’s working memory, a short-term buffer that the brain uses to maintain and compare pieces of information.
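The context-dependence described above—the same mark on a staff naming a different note depending on the clef, with the key signature applied from working memory—can be sketched as a toy function. This is a deliberately simplified illustration (the name and parameters are hypothetical, and it ignores octaves, flats, and ledger lines):

```python
LETTERS = "CDEFGAB"

def note_name(position, clef, key_sharps=()):
    """Map a staff position (0 = bottom line, counting lines and
    spaces upward) to a letter name, given the clef context."""
    # The same position names a different note in each clef:
    # the bottom line is E in the treble clef but G in the bass clef.
    base = {"treble": LETTERS.index("E"), "bass": LETTERS.index("G")}[clef]
    letter = LETTERS[(base + position) % 7]
    # The key signature appears once, at the start of the staff, so it
    # must be held in working memory and applied to every matching note.
    return letter + "#" if letter in key_sharps else letter

# The identical mark is read differently in each clef:
note_name(0, "treble")            # "E"
note_name(0, "bass")              # "G"
# With F-sharp in the key signature, an unmarked F becomes F#:
note_name(1, "treble", ("F",))    # "F#"
```

The point of the sketch is that no single symbol on the page determines the note: the clef and key signature act as context that the reader must carry forward, which is exactly the kind of maintained information the text attributes to working memory.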
Verbal labeling takes time, but musicians usually translate the visual image into movements more quickly. The same pieces of visual and contextual information that go into verbal labeling become bound instead to action plans that move the hand and fingers to appropriate locations on the piano or other instrument. Because it was this ability to translate a musical score into action that was damaged after McDonald’s stroke, and because the stroke affected the angular and supramarginal gyri in his right parietal hemisphere, we can logically suspect that these parts of the brain are involved in binding together the mental representations of the notes and the actions.
Researchers have learned recently that corresponding regions in the left hemisphere are also involved in reading music. But damage to that hemisphere has the opposite effect from the one that resulted from McDonald’s stroke. Daniele Schön, Ph.D., and his colleagues at the universities of Trieste and Padova studied a professional musician who had a stroke that damaged her left temporo-parietal area.2 In direct contrast to McDonald—who could name notes but not read and play them—she was unable to name the notes of a melody written in the bass clef, even though she could read the melody from the score flawlessly when playing it on the piano. Similarly, she could play notes whose names were spoken to her, but she had pronounced difficulty in writing those same notes. So in this case, the ability to bind note names with their visual symbols was impaired.
A similar failure of the note-naming system was evident in another patient described by Lisa Cipolotti, Ph.D., and her colleagues at University College London.3 This patient had left hemisphere damage and considerable language impairments (aphasia), but virtually no musical impairments, with the exception of the inability to name written musical notes or produce a note from a verbal label. In this particular case, the patient had trouble reading letter names and comprehending letter sounds, regardless of whether they were part of words or notes in a score, illustrating how a common mental representation of letters is likely involved in both language and music.
Neuroimaging the Healthy Brain
These observations in people who have experienced a brain injury are complemented by observations that were made in neuroimaging studies of people without any brain damage while they read music.4, 5 These studies, taken together with the location of McDonald’s brain lesion from the stroke and his impaired score-reading, strongly implicate the right parietal cortex as a nexus in the multifaceted process of reading a musical score.
In order to isolate those regions involved in a particular mental process, neuroimaging studies carefully compare two experimental tasks that differ in only one critical aspect. In one key study, Justine Sergent, Ph.D., and her colleagues at the Montreal Neurological Institute used a series of tasks to identify brain regions involved in the mental processes associated with reading and playing melodies. To identify what parts of the brain are involved in playing simple musical scales, Sergent and her team subtracted the activity in the brain of a person listening to musical scales being played by someone else from the activity when the same person both plays a scale and hears it played. The main difference between these two conditions was the act of playing, because in both cases the subjects heard the scale.
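The subtraction logic behind this kind of comparison can be sketched with synthetic data. This is a toy illustration of the principle, not an actual fMRI analysis pipeline; the condition names, grid size, and numbers are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical voxel activation maps (arbitrary units) for two tasks
# that differ in only one critical aspect: listening to a scale alone,
# versus playing the scale while hearing it.
listen_only = rng.normal(1.0, 0.1, size=(8, 8))
play_and_listen = listen_only.copy()
play_and_listen[2:4, 2:4] += 1.0  # extra activity attributable to playing

# Subtracting the two maps cancels everything the conditions share
# (hearing the scale) and isolates the regions engaged by the act
# of playing itself.
contrast = play_and_listen - listen_only
active = contrast > 0.5  # boolean map of "playing-specific" voxels
```

Here `active` flags only the patch of voxels whose activity differs between conditions, mirroring the logic by which Sergent's team attributed the residual activation to the act of playing.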
Next, to get at the more complex processes associated with reading a musical score, they compared brain activity caused by reading a score with that caused by viewing and responding to a simple pattern of dots on a screen. The idea was to figure out where in the brain a musical score starts to assume meaning, once the lower-level perceptual processing of the notes as simple dots has been accomplished. They also compared the brain activity elicited by both reading and hearing a musical score with that involved in simply reading the musical score, so they could identify those brain areas responsible for hearing the music.
As a result of these types of comparisons, Sergent and her colleagues learned that the visual processes involved in reading a musical score engage parietal regions at the border of the parietal and occipital cortices, while the auditory processes engage the supramarginal gyrus. Following up with even more specifically targeted comparisons helped them to identify areas that bind together representations of the sensory information—for example joining the position of a note on a staff, or the letter name for that note, with the appropriate actions, such as the hand or finger movements necessary for playing the note. As a result, they determined that parts of the superior parietal lobule, along with a broader cortical network of motor areas, are involved in these interactions.
More recently, Daniele Schön and his team were interested in finding what brain areas specifically represented the mapping between reading a musical score and the musical action. They sought to isolate the specific areas involved in directly converting a score into movements, using an experiment that compared playing in response to letter names versus playing in response to Arabic numerals that were used to identify the different piano keys. Their study identified specific sites in the right superior parietal lobule and the intraparietal sulcus (the sulcus that forms the upper border of the angular and supramarginal gyri) that were more active specifically during score reading.
Emotion and Time
As mentioned earlier, McDonald’s piano playing was profoundly affected not just in terms of mechanically reading and playing the piano but on an emotional level as well. He observed, “I seem to have lost the ability to follow from one chord to the next. The music did not seem to ‘make sense,’ that is, there did not seem to be the necessary or implied connection between one chord and the next of which anyone familiar with this music is aware.” Commenting further on this lack of cohesion between successive chords or notes, which made it difficult to discern the emotional intent of the composition, he wrote, “The succession of notes, even when correct as a result of having designated them by letters, made no musical sense: the emotional content the sequence of notes should convey, which determines phrasing and expression, was completely absent.”
These statements by McDonald provide two intriguing insights into the functions of the brain regions that were damaged. First, they suggest that these areas not only bind together features of an object or event at a single point in time—for example, binding the image of a note with its letter name—but also bind information over a period of time, such as connecting a series of notes into a coherent musical phrase. This makes sense because numerous studies have found that these areas of the brain are activated during tasks that require maintaining a sequence of items in working memory. Keeping contextual information in mind, such as which clef the notes on the staff belong to or which key they are in, also depends on working memory.
More intriguing is the idea that this area of the parietal cortex is somehow involved in an emotional response that arises from the cohesiveness of a series of musical notes and phrases. This suggestion is surprising because emotion is typically associated with other parts of the brain, such as the limbic system or parts of the prefrontal cortex. Thus, one has to find a way to reconcile the more mechanistic functions of the parietal cortex—for example, sensorimotor binding and working memory—with the more abstract and ephemeral experiences of emotion and meaning.
Ultimately, all behaviors and their emotional manifestations depend on coordinated activity in multiple brain areas, so a possible solution to this conundrum presents itself if we consider the role of parietal cortex in the context of the broader brain networks of which it is a part. McDonald’s observations suggest that part of the parietal cortex might be a gateway through which the emotional systems of the brain find out whether a sequence of events and actions merges into a coherent experience. The work of Marcus Raichle, M.D., and his colleagues at Washington University in St. Louis shows that the supramarginal and angular gyri belong to reciprocally active networks.6 The supramarginal gyrus appears to be part of a network that supports directing attention outward to objects and events in our environment, maintaining information in working memory, making decisions, and implementing actions. In contrast, the angular gyrus appears to be part of a network that is more active when a person’s thoughts are directed inward, as when evaluating how one feels about something, or when forming larger-scale action plans. It might, therefore, be critical in giving music its emotional meaning.
The Multiple Roles of the Parietal Cortex
Cognitive deficits arising from brain damage caused by stroke rarely affect only a single mental process or ability such as language, semantic knowledge, memory, or attention. The musical deficits in the neurological cases discussed earlier were generally part of a broader spectrum of compromised abilities. Although his language skills were largely unaffected, McDonald experienced not only musical impairments, but also a variety of other difficulties that required the combination of spatial, visual, and temporal information or the comparison of information in working memory. For example, he became unable to navigate in a foreign city using a map, and he had difficulty crossing the road because of problems with the spatiotemporal abilities necessary to judge the direction and speed at which cars were traveling. He also suffered from dyscalculia—an inability to do mathematical calculations in his head—when he tried to convert the value of one currency into another. Like binding together notes into a musical phrase, such skills also require maintaining and combining multiple pieces of information across time and are known to involve the parietal cortex.
Some of the other people with musical alexia who have been studied by researchers also experienced other deficits, typically in their language abilities, as well as in other musical skills in addition to reading a score. In one case, the patient’s ability to reproduce a rhythm or tap along with one was impaired, but her ability to recognize familiar melodies and discriminate between melodies was preserved. In contrast to McDonald’s right hemisphere lesion, the lesion in her case was on the left side of the brain, spanning the area from the auditory cortex to the angular gyrus, perhaps explaining why she had greater language difficulties.
Although in this article I have focused on only a few cases of musically talented people with brain damage, the patterns of deficits that result from damage to the same general brain region are striking. On the one hand, certain common principles emerge. For instance, information is bound across time and perceptions are bound to actions. This suggests that the parietal cortex plays a critical role in binding together different kinds of mental representations. However, the patterns of highly specific deficits are not consistent, which raises an interesting question about the modularity of functions in the brain and whether separate modules for music and language exist. Cases in which language is impaired (aphasia) but musical ability is not, and vice versa, support a modular view. But perhaps such dissociations between the two abilities are, instead, simply an idiosyncratic consequence of a person’s individual life experiences. In other words, to what extent is the organization of the parietal cortex a consequence of a lifetime of experience and particular skills, each of which requires putting different pieces of information together in different ways? To the extent that various people become expert in different behaviors, or implement the same behaviors and skills in different ways, the details of their brains’ parietal topography may also vary.
Let’s imagine for a moment that, rather than playing the piano by reading a musical score, McDonald routinely improvised and played by ear. Or imagine that he made music proficiently in all these ways. Would the stroke have affected these music-making abilities equally, or would he, nonetheless, have experienced only musical alexia? My guess is that the music still would have stopped making sense to him and would have lost its emotional richness, even if he was playing by ear. Although this type of deficit seems qualitatively different from the inability to put together specific pieces of information, such as the positions of notes on a staff with the appropriate finger movements, it does suggest that cohesion among events is an important component of our emotional responses to those events.
One way to try to dissociate the emotional components from the more mechanistic aspects of binding information might be research using transcranial magnetic stimulation. This technique employs pulsing strong magnetic fields above specific brain areas in order to create temporary lesions. I would predict that stimulating the angular gyrus would result in a transient loss of the sense of emotional meaning without affecting musical score reading, whereas stimulation of the adjoining supramarginal gyrus might have the opposite effect.
McDonald’s experience and the other case studies I’ve described highlight the significance of the parietal cortex as a sort of switchboard and sequencer, tasked with selecting and binding together those pieces of information that are required for fluid sequencing of actions. Twenty years ago, parts of the parietal cortex were recognized primarily for their critical role in visual attention, and neurological deficits in that area were associated with visuospatial neglect—the tendency to ignore the side of the space outside oneself that is opposite to the side of the lesion. With the advent of neuroimaging, it has become evident that the parietal cortex is involved in the more general directing of attention and that the process of selecting objects or locations on which to focus attention is closely related to working memory, which holds multiple representations in mind in the service of achieving some sort of goal.
For those who would like to ascribe a single function to a single brain region, this view of the parietal cortex as a multipurpose dynamic router might be too nonspecific and unsatisfying. But others, I among them, view the core mental operations at work as selecting, binding, and sequencing information. For us, the joy comes about in trying to understand the neuroanatomical and functional details of how and why specific pieces of information—such as pictures of musical notes, names of notes, and actions associated with notes—become bound in the way that they do.
Studying the functional organization of the brain provides constant reminders that the organization and manipulation of perceptions, thoughts, and actions are rarely as straightforward as we might hope or imagine. As we examine individual cases of people with brain damage, such as McDonald, and review functional neuroimaging studies that purport to examine seemingly unrelated phenomena, we are reminded that the brain’s processes underlying highly specific behaviors and skills are actually more intertwined than was at one time believed. By taking into account similarities across the multiple behaviors at which humans excel, such as music, language, mathematical reasoning, and planning, we stand a better chance of elucidating the universal principles of human brain function.
- McDonald, I. Musical Alexia with Recovery: A Personal Account. Brain 2006; 129: 2554–2561.
- Schön, D, Semenza, C, and Denes, G. Naming of Musical Notes: A Selective Deficit in One Musical Clef. Cortex 2001; 37(3): 407–421.
- Bevan, A, Robinson, G, Butterworth, B, and Cipolotti, L. To Play “B” but Not to Say “B”: Selective Loss of Letter Names. Neurocase 2003; 9(2): 118–128.
- Schön, D, Anton, JL, Roth, M, and Besson, M. An fMRI Study of Music Sight-Reading. Neuroreport 2002; 13(17): 2285–2289.
- Sergent, J, Zuck, E, Terriah, S, and Macdonald, B. Distributed Neural Network Underlying Musical Sight-Reading and Keyboard Performance. Science 1992; 257(5066): 106–109.
- Raichle, ME, and Gusnard, DA. Intrinsic Brain Activity Sets the Stage for Expression of Motivated Behavior. Journal of Comparative Neurology 2005; 493(1): 167–176.