Poet, naturalist, science writer, and adventurer Diane Ackerman brings to her latest book, An Alchemy of Mind, an ability to make the molecular biology of the brain vivid and charming (and yes, sexy) and to render the sensuous delights of a summer garden in terms so precise that reproduction of the experience under laboratory conditions may well be possible. The tour de force continues through 34 chapters on biology, cognition, personality, emotion, the senses, and consciousness that assemble the human brain according to the latest reports from the world’s neuroscience laboratories and, with an electrifying poetic touch, bring it to life. Our excerpt is from chapter 14, “Reﬂections in a Gazing Ball” and chapter 15, “Remember What?”
Excerpted from An Alchemy of Mind: The Marvel and Mystery of the Brain by Diane Ackerman. © 2004 by Diane Ackerman. Published by Scribner. Reprinted with permission.
REFLECTIONS IN A GAZING BALL
...One fall morning when I was six, I hurried through an orchard with three schoolmates. We were late for ﬁrst grade, and there were going to be silhouette drawings none of us wanted to miss. I remember the shiny plaid dress that Susan Green wore, her matching hair ribbon, and a petticoat that rustled as she moved. Ripening apples spiced the air with scent. High in the branches, dark plums huddled like bats. Susan dragged my arm because I’d slowed to stare at the plums, her eyes followed mine, and when she demanded to know what I was looking at, I told her. Suddenly she let go of my arm and all three girls recoiled. The possibility of bats didn’t frighten them. I frightened them. I looked at plums and saw bats. The alarm on their faces became an indelible memory, one colored by shame, and fused with a nearly levitating sense of wonder.
That vignette holds the kernel of many truths about memory. To remember, the brain does four things superbly: recognizes patterns, interprets them, records their source, and retrieves them. In the orchard, I saw autumnal apples, smelled their cork-like sweetness, heard the warble of children, held unforeseen feelings. At the time, brain receptors combined all the stimuli fast, busily interpreting what an orchard meant, the unique scent of apples, the similar architecture of plums and huddled bats, and noting the whereabouts as a walk to school, while it piled up emotions.
Now, half a lifetime later, only shards of that orchard memory float into my ken—light swerving off a satin skirt, clouds caged in the branches of a tree, the hug of an elastic waistband on the corduroy pants I wore under my dress, the scratchy whispers of Susan’s starched petticoat. But I can’t forcibly capture others. Not the other girls’ names, not the cadence of their voices. Lots of details have faded through the normal vanishing and crumbling known as graceful degradation. Like an old photograph, my recall of that day has lost some of its color and clarity with passing years. But, unlike a photograph, the memory isn’t stored whole. It’s distributed throughout the brain and slowly dissolves, sheen by warble, whiff by shame.
Fortunately, memories nest in elaborate thickets of association. If I concentrate hard and insinuate myself back into that moment, into that body, peering out behind curly bangs, I can feel the swing of my ponytail, and my body ﬂushed with wonder the way a cube of sugar absorbs water. The sort of sugar cube with a slightly bitter pink splotch we received in squat, pleated white paper cups one day at school when the ﬁrst polio vaccines were distributed.
My memory caroms off another association. The sort of sugar cube a stable hand placed onto my trembling palm one summer, which a horse gummed off gently with velvety lips that tickled. Not just the horse’s lips but the whole experience tickled, and I laughed, which didn’t frighten the horse, used to the antics of pint-sized humans. Especially if they made high-pitched noises. Had I hissed or growled, the horse might have bolted. But in its long-term memory, it knew that baby horses whinnied a sprightly coloratura, and maybe that the helpless young of other mammals made high-pitched noises, too. If I’d been bitten by that horse, the event would have burrowed deep into my memory and advised my reflexes as well as my conscious mind. Even deeper in the horse’s long-term memory, and ours, lodges the sound of a cat’s claws scraping on a rock (which may be why chalk squeaking on a blackboard chills us), an alarming sound, an instant warning. An animal rarely gets two chances to survive a life-threatening attack. It must store the memory faster than a snake strike, and keep it on tap for a lifetime.
Indeed, pain and memory have a lot in common. Both change the NMDA receptors, allowing them to open more easily and stay open longer. Pain is like a bad memory that won’t fade, which makes good evolutionary sense: it’s not enough to identify danger; an organism also has to remember its features. Under general anesthesia, one may not be aware of pain, but the nervous system remembers it well. At the University of Pennsylvania Medical School, Allan Gottschalk and his colleagues have been administering “preemptive analgesia,” pain medicine beforehand, not just during and after an operation. Hoping to keep chronic pain memories from forming in the first place, they block the nerve pathways leading from the injury site to the spinal cord. Thus far, it’s working well in prostate operations.
THE MESSAGE IN THE PATTERN
Back to the orchard. The long string of facts about apples, horses, orchards, and such involves semantic memory. Remembering that specific morning and those girls—on a day when the plums caught my eye, the clouds seemed caged by the branches, and I felt both shame and wonder—employs episodic memory. Combine the two and you get declarative memory, a declaration of events easy to slap a word on, the explicit truths. For both storage and retrieval, such memories rely on the seahorse-shaped hippocampus (from the Greek for “sea horse”).
Unconscious memory and knowing how something happened involve many areas of the brain, including various sensory and motor pathways. Knowing how may be the record kept by my body of what it was like to walk through the orchard. These subtle skills evade words, but the body remembers itself through a wardrobe of mannerisms, habits, preferences, gestures, biases, styles of talking and thinking. Perhaps a walk that’s more amble than stride. Or how, when nervous, you unconsciously pick at your cuticle. How volatile you become when provoked. How little it takes to provoke you. Such minutiae rule and define us, allowing a body to remember the individual it retains. We need to think about swimming while we’re learning, but afterward the body remembers how to float, angling a hipbone just right, without consulting us.
There isn’t a single place, a memory-mine, where all the ore of experience lies buried. Different types of memory inhabit different parts of the brain, and groups of neurons combine to form a single memory. A good way to picture this is offered by the neurologist Jeff Victoroff, who suggests a football stadium at halftime:
Twenty thousand people sit across from you, each holding a colored card. At a signal, they ﬂip their cards into position, and the pattern spells out a message: “Go Trojans!” The idea “Go Trojans” is not written on any one of those 20,000 cards; it is not located at any one seat. It exists only as a pattern of activity, like the coordinated ﬁring of 20,000 neuronal responses. In the same way, memories are stored in our brains not in any one place but as a distributed network of neurons, primed to ﬂip their cards of synaptic activity in a coordinated way.
Continuing that metaphor, one excited person can somehow rally all the others. Stimulate one facet of a memory and the whole can suddenly pop into mind.
Locked forever inside the body’s ribbed prison, we can only know life through our avid, prowling senses. They report to a region near the hippocampus where their information creates a multifaceted picture of an event. After several more steps, the information returns to the hippocampus or to the neocortex for storage. We might store facial memories in the temporal lobes, a landscape in the parietal, and socializing in the frontal lobes. But, since we remember a whole event, not a spray of sensations, everything blends in the large association cortices that make up most of the neocortex.
Somehow, all these ways of remembering combine and we feel singular. Add enough pieces to the mosaic and an individual ﬁnds shape. We take for granted these dazzling skills, and the most treasured gift of all, being able to time-travel and explore the lost kingdom of yesterday. We may be the only animals with this rich form of episodic memory, in which we can revive our past, play it back like a ﬁlm we stop to look at, enter imaginatively, and revise as we grow older.
Say memory, and almost everyone thinks of the past. But most of our memories are really about the future. We quarry experience to help us solve present problems. A hungry squirrel needs to recall nuts buried at the base of a sycamore tree. A mother lion needs to recognize the scent of her cub. We need to remember a swindle or a bonus to foresee the outcome in a similar situation. Overlapping beacons of memory guide an animal through an ambiguous, confusing world, in which it runs a four-dimensional obstacle course of threats, injuries and challenges. It must survive long enough to mate and rear young who will pass the spiral baton of its genes and theirs to offspring in a relay race that began so long ago no one can remember who ﬁrst ran it, or where.
As I write this and you read it, we’re using a golden kind of short-term memory called working memory, a mental draft horse. Our working memory provides the general feel of our days: bite of lemon, skid of linen, tang of joy when a loved one wakes, repeating a telephone number we’re dialing, rehearsing what to tell the plumber, doodling a chickadee hanging upside down from an ice-glazed twig—all while receiving updates from the body on the career of one’s tummy, muscles, or worry. Working memory holds crates of information for immediate use, but it can only do one thing at a time. Interrupt someone while she’s trying to remember a phone number and she’ll probably lose her train of thought, as we like to say, as if thought moved with locomotive force in a straight line on steel rails. Involving the frontal lobes (a little above and behind the eyebrows), working memory combines sensory news, the emotions it arouses, and our conscious effort to remember something. Or, if you like, every train of thought has many cars. In the brain, they link together before we can act on an impulse, solve a problem, talk to someone, feed ourselves, daydream. One stores facts. Yet another is devoted to learning skills. Another works the body’s muscles. Others haul such essentials as language, social protocols, sensory routines, biodegradable resources, operating guides, rules for executive functions and rules for grunts, and many different grades of memory. They gyrate together in a single train of thought.
Whenever we learn something, the brain mints new connections or enlivens old ones. A few times each summer, I spruce up the mulch pathways through my garden, making them more walkable. The brain renews familiar pathways, too. I find this especially comforting in middle age, when my memory seems to be fading like an old picnic blanket, and recalling a simple word may take an annoying amount of time and effort. I can see the object in my mind’s eye; it just takes longer to fetch the word for it. Not a problem when I’m writing (because it normally requires an extra millisecond to escort a word from brain to paper), this lag can turn a simple conversation with oneself or someone else into a maze of circumlocution. Not being able to recall the word awning, for example, I might retrieve it by picturing and naming its associates at speed (porch, window, chaise, wicker, and so on) until the word appears under the same heading, or mental awning. Or I may furiously search for a synonym among awning look-alikes (drape, canvas, umbrella), making do with a close word instead. I may picture the house yawning outward and hunt the verb. This runaround can be worrisome and frustrating. It upsets the rhythm of quick elusive chat and backchat. Thus it becomes not so much a roadblock as a hurdle (a word I just had to grope for and only found by picturing an eponymous track-and-field event). A personal myth is that such catastrophes appeared one day when the calendar began tearing off its own pages. But, of course, it always starts earlier, this angling for a sparkling word that darts away. Young children tend to recall events in lively detail, but that youthful gift already begins to pale at puberty, by which time life offers so many sensations that one can’t remember them all.
It’s right on the tip of my tongue, we say picturesquely, as if a moth were perching on the taste buds. Tip-of-the-tongue memory is more complicated than it seems. Remembering a word takes two steps, pinpointing the word you want and then fetching the sound code for the word. It’s possible to retrieve only the ﬁrst part, a semantic idea of a word, and not be able to remember its sounds, due to weak connections. Even though awning was hard to excavate, it did pull free. I could think and name it. We collected such memories at least once before, and now re-collect them like a basket full of mushrooms.
THE PLAGUE OF FORGETFULNESS
Without the brain’s temporal lobe, conscious recollection wouldn’t happen. That’s why Alzheimer’s, an illness that depletes the temporal lobe, begins slowly in a plague of forgetfulness easy to confuse with the forgetfulness that’s a normal bane of aging. In time, the disease invades other parts of the brain, attacking all forms of memory, until a recognizable self—one of memory’s better pranks—disappears like chalk underneath the brisk strokes of a blackboard eraser. “I am not in my perfect mind,” King Lear laments. “Methinks I should know you, and know this man; yet I am doubtful; for I am mainly ignorant what place this is, and all the skill I have remembers not these garments; nor I know not where I did lodge last night.”
The neurologist Spencer Nadler often receives e-mails from his patient Morris, who is in the early stages of Alzheimer’s and, miraculously it seems, can eloquently describe his loss of mind while it’s deteriorating. “Thoughts no longer percolate in my brain,” he conﬁdes. “They’ve slowed, become viscous.” Having the illness is not worse than knowing he has it. “Living with incurable, progressive dementia, the horrors of the fact and illness combined,” he writes to Nadler, “is like living in the aftermath of a nuclear war, surrounded on all sides by devastation and waiting for the radiation sickness to make you ﬁnally wither and drop.” More philosophical and upbeat than most would deem possible in such circumstances, he one day says in an e-mail: “What difference does a little dementia really make, when the greatest minds have struggled in vain to know themselves and others?”
About 4.5 million people now suffer with Alzheimer’s, and that number is estimated to climb to 16 million by 2050, when the baby boomers will enter their anecdotage. There’s early-onset and late-onset Alzheimer’s, with the early version being more hereditary. Surely a magic drug called something like abracadabra will unlock the rusty gates? The consensus is no, not yet, but someday soon. Meanwhile, drugs like Aricept oil the hinges (by restoring lost acetylcholine to the synapses) and laboratory wizards are targeting the disease from many angles, some focusing on the twisted tangles of tau proteins, others on the bull’s-eye-shaped plaques.
Genes order the making of proteins, which are strings of amino acids that normally can loop, curve, twist, and even form pleats, through a million gyrations, before reaching their final form. But their shape dictates how they’ll behave in the body, so the strings must fold correctly, and many don’t. What to do with the tangled protein strings? Cells have protocols for dealing with them, chaperone molecules to surround and protect them, others to tag them with the amusingly named “ubiquitin” and drag them off to slice up and recycle. But the janitorial system doesn’t always work. “A certain amount of misfolding is fine,” says Fred Cohen, of the University of California at San Francisco. “The cell can handle the trash. But if there’s a garbage strike, the trash on the sidewalks begins to stink. That’s what we’re dealing with.” Some scientists believe that Parkinson’s, Alzheimer’s, mad cow disease, and many other scourges are misfolded protein diseases. In Alzheimer’s, thanks to various saboteurs—such as the enzyme beta-secretase, which clips proteins sticking out from the cells, leaving stumps in a muddle—clusters of deformed proteins clog up regions of the brain involved with memory, location, and mood. Why that happens isn’t clear—genes? exposure to toxins? inflammation? trauma? But people with larger brains, more education, or complex thought seem to resist its symptoms longer.
What’s come to be known as the Nun Study recently offered intriguing clues to Alzheimer’s. Although the epidemiologist David Snowdon of the University of Kentucky limited his study to a small homogeneous group of nuns (the School Sisters of Notre Dame), it boasted 90 percent accuracy, and suggested that one’s thinking style while young can predict the disease in old age. Analyzing the nuns’ writings as young women, before they took their vows, Snowdon and his colleagues found that those using the simplest sentences with the fewest ideas were the ones most likely to develop Alzheimer’s later in life. What are we to make of this? Maybe just that people with lots of cell connections fare better because they can afford to lose more. Or does nutrition play a role? The Nun Study also discovered that the cerebral cortex weakened in sisters with low levels of folic acid.
Although the brain ages dramatically like the rest of the body, old age doesn’t guarantee senility. Retrieving a particular word may take longer, but vocabulary can actually improve. There’s lots of evidence that seasoned skills age more slowly. Habits of mind linger, especially expertise. Anton Bruckner continued composing lush melodic symphonies into his seventies. I’m always pleased to see the nonagenarian physicist Hans Bethe (among a great many accomplishments, he figured out how the sun works) strolling in the Cornell arboretum with his wife, friends, and colleagues. At ninety, he signed a five-year contract, and continues to stay politically and professionally active. As the most senior living scientist who worked on the atomic bomb at Los Alamos in the 1940s, he’s been campaigning ever since for nuclear disarmament. Much of each day he happily devotes to the study of exploding stars. Yet other densely imaginative thinkers—Iris Murdoch comes sadly to mind—have endured the mental deforestation of Alzheimer’s. Both Bethe and Murdoch spent their lives juggling complex sentences and ideas.
At the local airport ten years ago, I happened to be in line behind Bethe (pronounced BAY-ta) at the ticket counter and overheard the clerk say to him, slowly and in a louder voice than needed, as if he were carrying an invisible ear trumpet and must, at his age, be lost in senility:
“Now, Mr. Beth-ee, you’ll be arriving at gate 21 in Pittsburgh and going to gate number 27. That’s six gates away.”
A small bemused smile ﬂitted across Bethe’s face. “Oh, I think I can do the math,” he said.
Most everyone I know frets about memory loss, and, in what’s become a mass phobia, worries whether each slip foretells the reign of Alzheimer’s. Maybe because it can be so embarrassing, people seem especially bothered when they forget familiar names. I know several couples who have chosen a help-I-can’t-remember-the-name signal; the other quickly introduces him/herself and hopes the person the couple has just bumped into will do the same.
We complain about normal forgetfulness, but thank goodness we don’t have better memories. We aren’t required to remember how to operate our bodies, for instance, or even the full carnival of sense impressions spawned by a single moment.
Remember every slight and insecurity since childhood? People cursed with comprehensive memories have minds like overstuffed closets—open the door and an avalanche pours out. A simple choice can balloon into a nightmare of competing outcomes. Not a good survival plan. Forgetting isn’t the absence of remembering, it’s memory’s ally, a device that allows the brain to stay agile and engaged.
Running a peak-performance animal is expensive, so the body devotes only the ﬁrst half of life to high-octane survival and breeding skills, which include salting away lifesaving memories. After childbearing, memory begins to deteriorate, as does the rest of the body—joints grow less supple, taste buds less sensitive. In the evolutionary scheme of things, infertile humans are socially helpful but not essential, and they do compete with the young for food and shelter. But we’re willful beasts who don’t always play by evolution’s rules. We insist on learning them, though. Part of our great charm as a species is our passion to understand everything that touches our lives. Solving mysteries is the brain’s fetish and pastime, offering a special caliber of pleasure that falls between thrill and relief. We do it for survival, we do it for work, we do it for play. Fortunately, nature holds more secrets than we can unpuzzle.
HOW TO MAKE A MEMORY
But some are yielding. One mystery that’s tantalized people for ages is how the brain embeds long-term memories. As I mentioned earlier, brain cells communicate by sort of shaking hands at hundreds of billions of minute contact points called synapses, slender inlets between neurons. To store a long-term memory, a cell lathers its handlike axon with more protein, which strengthens its grip. It may enlarge synapses or create doubles or triples of the synapses already there. But it’s an elaborate process, and for good reason. Short-term memories may swarm and vanish, but if we cluttered our minds with them for long, we’d drown in a sea of formed and forming memories. To prevent that, installing a long-term memory requires the safety lock of simultaneously switching on some genes while switching off others. The brain checks and double-checks. Eric Kandel, of Columbia University, theorizes that age-related memory loss might have to do with a defective genetic switch, so that short-term memory doesn’t convert to long-term with a spurt of new proteins. And some people who seem to have an extraordinary memory may simply have a defective part of the switch that inhibits genes. In the mansions of the mind, lamps are mistakenly left on because the switches are broken, so they continue casting light.
In 1949, the Canadian psychologist Donald O. Hebb proposed that memory stems from neurons working in unison to strengthen the synapse where they meet. We’re sociable beings, even on the cellular level, where active neurons cement their mutual bonds, forming “little cliques, or social clubs, within the brain,” as the neurosurgeon Frank Vertosick puts it. They really are social clubs, societies of cells, and some will be influenced more than others by lobbying neurons. Just as in a human society, the majority reaches a decision, despite naysayers. But neurons aren’t altruists; they don’t act sensibly for the good of their little clique. They act separately, selfishly, to promote their own genes, oblivious to the others, and it doesn’t matter a whit if some aren’t on the bandwagon.
I prefer to think of the Hebb rule as a bedroom drama. The more often a neuron excites another neuron the easier that becomes. The more two neurons excite one another, the tighter their bond grows. The reverse is also true: a neuron inhibiting another weakens their bond. Neurons love to be turned on, to feel alive, and they prefer exciting contacts over those that turn them off. A neuron doesn’t turn on instantly, it requires a little buildup until a threshold is reached. Once aroused, it ﬁres brieﬂy down its length, is drained of its juice, then must rest a while and recharge before it can perform again. If a partner is still very excited, it may respond by ﬁring again and again; if not, it may quiet down. In neuroscience parlance, neurons that ﬁre together wire together.
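The Hebb rule described above ("neurons that fire together wire together") is often summarized as a one-line weight update. Here is a minimal toy sketch in Python, not from the book; the function name, learning rate, and starting weight are all invented for illustration, and real synaptic plasticity is vastly more intricate.

```python
# Toy illustration of the Hebb rule: a synaptic weight grows only
# when the pre- and postsynaptic neurons are active together.
# All names and numbers here are hypothetical.

def hebb_update(weight, pre, post, rate=0.1):
    """Strengthen the synapse when both neurons fire (pre and post
    are 1); leave it unchanged when either is silent (0)."""
    return weight + rate * pre * post

w = 0.5
# "Fire together, wire together": three co-activations tighten the bond.
for _ in range(3):
    w = hebb_update(w, pre=1, post=1)
print(round(w, 2))  # 0.8

# Firing alone changes nothing: the bond needs coincidence.
print(hebb_update(0.5, pre=1, post=0))  # 0.5
```

The key design point is the product `pre * post`: the update is zero unless both sides are active, which is exactly the coincidence requirement Hebb proposed.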
Extending this idea in 1973, Timothy V. P. Bliss and Terje Lømo of the University of Oslo discovered that if they stimulated nerve cells in the hippocampus with high-frequency electric pulses, the cells became tightly linked. That stronger grip, known as long-term potentiation, can hold a memory in readiness for hours or weeks. Later studies by other scientists revealed that applying low-frequency pulses to hippocampal pathways weakened the connections. So, as I’ve said, many believe this two-step of strengthening and weakening of synapses allows various precincts of the brain to store and erase information of various sorts. In the lima-bean-size amygdala, for instance, memory is most likely involved with emotion, since part of the amygdala’s role is to link emotions and experience.
In the 1980s and 1990s, several researchers identified doughnut-shaped molecules (NMDA receptors) as key to the cell changes. Seated in the outer wall of certain neurons, they’re essentially coincidence detectors. Their central doorway is closed with a double lock. If transmitting and receiving neurons fire simultaneously, the door opens and current flows in, helping the brain associate two events. Timing is everything. In older animals, two signals must arrive almost simultaneously for the cell to open its gate, but in younger ones signals can be relatively far apart (a tenth of a second), which may explain the rich and enduring memories of the young, and why adults find it harder to learn new things. It’s tempting to imagine two signals colliding on the doorstep—a biophysical open sesame—and then, door agape, memory pouring in like cement.
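The coincidence detection described above can be pictured as a simple timing test: the receptor’s “door” opens only if the two signals arrive within a short window of each other. A toy sketch follows; the function name and window lengths are invented for illustration, and real NMDA-receptor kinetics are far more intricate.

```python
# Toy picture of an NMDA-style coincidence detector: the gate opens
# only when two signals arrive close enough together in time.
# Window lengths here are hypothetical, loosely echoing the text's
# "a tenth of a second" for young animals.

def gate_opens(t_signal_a, t_signal_b, window=0.1):
    """Return True if the two arrival times (in seconds) fall
    within the coincidence window of each other."""
    return abs(t_signal_a - t_signal_b) <= window

# A wide (young) window associates signals a tenth of a second apart;
# a narrow (older) window does not.
print(gate_opens(0.00, 0.09, window=0.10))  # True
print(gate_opens(0.00, 0.09, window=0.02))  # False
```

Narrowing the window with age, as the passage suggests, would mean fewer pairs of events get linked, while the same two signals that slip past an adult’s gate would still open a child’s.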
But as Larry Squire and Eric Kandel remind us in their overview of the ﬁeld, Memory: From Mind to Molecules, a memory isn’t instantly engraved, but takes time and several steps to embed:
Until the process is fully completed, memory remains vulnerable to disruption. Much of this process is completed during the ﬁrst few hours after learning. But the process of stabilizing memory extends well beyond this point and involves continuous changes in the organization of long-term memory itself.
As time passes, the hippocampus plays a lesser role and memories gradually join others in different parts of the brain, forming strata of belief about the world and oneself. The ﬁrst time you hear a soprano singing the beautiful lyric “The Sun, Whose Rays” from The Mikado, you may not recognize it. But after being repeatedly rewarded for associating those sounds with Gilbert and Sullivan—rather than with a wren’s song or jackhammer—some synapses ﬁnd it familiar. Their doors quickly open and you think, Mikado.
Not long ago, the molecular biologist Joe Z. Tsien of Princeton University doctored the genes of mice to see if he could improve their memories. First he bred mice without a vital part of their NMDA receptors in the hippocampus. Those mice had weaker synaptic connections and poorer memories. Then he reversed the experiment by enlivening the receptors. Sure enough, it created brainier mice which he named Doogie, after the young genius on the TV sitcom Doogie Howser, M.D. The receptor doors stayed open only 150 thousandths of a second longer than usual, but that brief welcome boosted memory and intelligence ﬁvefold. Suddenly the mice could learn all sorts of things faster—emotional, spatial, and auditory—and retain them longer.
On the way outside, I leave the screen door open an extra slice of a second. It seems simple and fast, yet it’s long enough for one of the resident hummingbirds to fly into the house and make mischief, or a June bug to clatter around the living room like a windup toy, or a triangular stinkbug to sneak in like a delta-wing bomber. We’re used to gross amounts of time, like stores remaining open half an hour longer, or pearl divers holding their breath an extra minute, or locking eyes with someone for whole seconds. In one second a tuning fork set to A above middle C will vibrate 440 times, a human heart will beat once, a moonbeam will almost reach the Earth, Americans will eat 350 slices of pizza. An average housefly flaps its wings every three thousandths of a second. So 150 thousandths of a second is hard to imagine, let alone its lifelong effect on someone’s destiny. One unstated innuendo is that animals can be smarter, that they’re not fully evolved yet. If that’s true of mice, it’s probably true of humans. In future days, will our descendants regard us as primitive lowbrows much as we regard the Cro-Magnons?
Tsien is often asked if we can engineer smarter children, produce a tribe of geniuses. “The short answer is no,” he says, “and would we even want to?” Many parents would. But intelligence hinges on an assembly of talents, not just good memory, dexterity, or a knack for problem solving. Different animals need to be smart in different ways. Whales, pigeons, and butterflies are astonishing navigators; dogs can detect a person’s essence in his footprints, picking up scent molecules that have seeped through shoe soles. But boosting their IQ wouldn’t make them good at nuclear physics or the tango. In a way, one-celled organisms have a truer sense of the world, because they respond to every stimulus they encounter, whereas higher animals like us select very few stimuli from the vast array.
HOW MUCH IS ENOUGH?
In any case, a better memory isn’t a more useful memory. Studies show that the IQ range of most creative people is surprisingly narrow, around 120 to 130. People with higher IQs can perform certain kinds of tasks better—logic, feats of memory, and so on. But if the IQ is much higher or lower than that, the window of creativity closes. Nonetheless, for some reason we believe more is better, so people yearn for tip-top IQs, and that calls for bigger memories. A fast, retentive memory is handy, but no skeleton key to survival. I’d add “or to happiness,” but survival doesn’t guarantee it any more than our founders did in the Declaration of Independence. All those gents promised was the right to pursue happiness.
Because IQ tests favor memory skills and logic, overlooking artistic creativity, insight, emotional reserves, sensory gifts, and life experience, they can’t really predict success, let alone satisfaction. Some theorists suggest other ways to portray intelligence. For example, Daniel Goleman writes persuasively about our powerful emotional intelligence that colors all our decisions. An agile memory doesn’t guarantee balanced intelligence (consider idiot savants), but one can’t be very intelligent without a good memory.
Suppose I don’t want to doctor my genes? Repeating something helps to store it long term. The dreaded tedium of rote learning, whose epitome I suppose is having to confess your bad behavior a hundred times on the blackboard, works tiresomely well. Athletes and surgeons alike develop mastery through repetition, a popular and reliable way to remember. On the other hand, tedium bars richer possibilities. Rote learning works well, but it’s rigid. It doesn’t allow creative solutions to bubble up through the magma on the spur of the moment. When a skill becomes automatic, the swampy path that led to it disappears, and it’s hard to make adjustments that might improve one’s gait or route. A theater director I know advises his actors to prepare lavishly through study and practice, then put habitual thought aside, risk spontaneity, and see what happens.
I use a skill devised by the Greek poet Simonides in 500 B.C. Noticing the brain’s natural tendency to fish for memories by way of associations, he would decorate a person’s name with a colorful image. Attaching a clever hook to the slippery name makes it easier to grab and haul into view. I may choose rhyming slang, a ghoulish cartoon of one feature, a clue based on personality. These aren’t always delicate or civilized tags, so I don’t tell people about them. Using such a memory aid means adding an extra scintilla of time between saying “Hi” and adding “Jane.” But it usually works, and it’s fun, if you remember not to giggle when you say “Hello, Lester,” while thinking he’s swollen like a blister, or that Ginny likes to drink gin at lunch. If Brenda is a redhead, I might picture her hair on fire, because I know the word Brenda, cognate with the German brennen, hides the idea of burning at its root, or roots. Holly I might remember by her halo of curly hair. Paul’s broad shoulders might remind me of a wall. One semester I began a course by confiding this technique, and we went around the room confecting funny images for each name. For once, everyone knew everyone else’s name from the start. Imagination’s muscle grows stronger with use, but toning memory calls for the ingenuity of Simonides.
Memory techniques, though fun at times, take work, and we lazy sorts hate to bother. Anyway, the sweetest memories arrive unbidden, as gifts given, like flying fish leaping into the sunlight for a moment before plunging back into the dark, churning sea of the unconscious.
“Don’t do an all-nighter before a test” is good advice, since many studies show that sleep improves recall. Sleep may aid memory through a process known as interleaving—gradually introducing the information through repetition, rather than just dropping it as a chunk the brain has to assimilate. In this way, folded like egg whites into the batter, the new information doesn’t disturb the old; it simply adds to it.
Cumin, an anti-inflammatory spice central to Indian cooking, may help memory. Traditional herbalists have always used melissa, also known as lemon balm, to improve memory. Lemon balm is a lovely garden herb with soft leaves and a mild lemon smell and flavor—in contrast to lemon verbena, whose scent and flavor are stronger and whose leaves feel raspy. I prefer the smell of verbena, but often add lemon balm to tea. Researchers at the Human Cognitive Neuroscience Unit at Northumbria University have isolated the herb’s active ingredient, a methanolic extract, which boosts the level of the neurotransmitter acetylcholine. They tried it out on some volunteers; others received a placebo. Doses of about 1,600 mg produced “striking” results: a combination of calmness and improved memory.
One thing is certain: memory suffers when we’re under stress. Both stress and tedium can kill brain cells. Challenge, novelty, and rich environments can rejuvenate memory. So does gentle aerobic activity, and, quite possibly, eating a cup of blueberries each day. So, if one needed an excuse to adopt a fascinating hobby, work on novel projects, or live outrageously, one could claim to be doing it for health purposes. How does this work? In the forest of neurons, transmitter chemicals leap about like capuchin monkeys. Stimulate the brain and the neurons grow new branches, which makes it easier for the monkeys to travel from one tree to the next. Fortunately, this growth process isn’t limited to one’s first three years but continues over a lifetime. Maybe one can’t teach arthritic legs to sprint. But one can always teach an old dogma new tricks.