- David Shenk
The Forgetting Page 5
The ant farm analogy also applies in another important way: Neurobiologists have found that memory formation is slow. Long-term memories can take many months or even years to fully form.
Long-term memories are durable, but not unassailable. They can last a lifetime, but from the first moments are subject to influences from other memories and experience. Inevitably, as they age and are evoked again and again, all memories change in character.
This is part of the brain’s famous plasticity, its ability to adapt to life’s events. Plasticity makes us as much creatures of our own experience as we are products of evolution. Not everything in the brain is adaptable, of course; much of it comes “hard-wired,” genetically preprogrammed to specialize and perform specific tasks such as processing light and sound, regulating heart rate and breathing, and so on. But the regions reserved for fine motor skills, intelligence, and memory are more like soft clay, able to take on a definite shape and yet remain constantly responsive to new stimuli.
Memory constellations, then, are not fixed, immutable collections of memories, but ever-variable collections of memory fragments that come together in the context of a specific conscious moment. Any common free-association experiment is a vivid illustration of this point. For me, at this moment, the word “cat” prompts → a thought of Brownfoot, my boyhood feline friend → the garage roof she used to leap from → the 1971 T-top Corvette my father used to drive → the tragicomic month in which Mom wrecked this car twice → a feeling of malaise associated with my parents’ divorce years later. This instant montage of memories is neither chronological nor predictable, even by me. If someone were to prompt me with “cat” tomorrow, depending on my mood or recent experience, I might think of the cat that my daughter called to yesterday outside our house. Or it could be that Brownfoot will come to mind, but that from there I will shift to an image of my playing her dentist, and then I might think of my own current dentist and how I’m way overdue for a cleaning. That guilty feeling might then trigger another distant idea, related only by a parallel feeling of guilt. And so on.
Taken together, this interconnected universe of constellations in each of us forms the core of who we are. Our life's ocean is full of memory waves that wash against one another to create a complex and ever-adapting character.
The director Martin Scorsese is an interesting memory-character study, mostly because he seems to forget very little compared to others. He remembers not just every shot and crew credit from each of the thousands of movies he’s seen, observes the New Yorker’s Mark Singer, but also every detail of every book, song, and personal experience he’s had in fifty-plus years—“all of it,” Singer writes, “seemingly instantly retrievable.”
Singer depicts the Scorsese memory constellation in action. After a colleague criticizes a piece of film dialogue as “too piercing,” Scorsese is instantly thrown into an interconnected memory odyssey:
He was reminded of the old Harry Belafonte calypso tune “The Banana Boat Song”—or, rather, a parody of same by Stan Freberg, which included a reference to “piercing,” and that reminded him of another Freberg routine, a parody of the television series Dragnet, which in turn reminded him of Pete Kelly’s Blues, a feature film directed by Jack Webb, the star of Dragnet. The production designer of Pete Kelly’s Blues, in which Webb played a bandleader during the twenties, was a Disney veteran who brought to it a remarkably vivid palette, a reality-heightening Technicolor glow reminiscent of the live-action Disney children’s films of the forties.… And, Scorsese further recalled, Pete Kelly’s Blues had a screenplay by Richard L. Breen, whose name, curiously, Webb had heralded before the title. When the picture was released, in 1955, the year Scorsese turned thirteen, he followed it from theatre to theatre, as was his habit.… [He then recalled all the specific theaters he used to frequent.] One particular Saturday afternoon double-feature at the Orpheum came to mind: Bomba the Jungle Boy and Great White Hunter.…
The pathways linking engrams can be built on temporal, intellectual, or aesthetic associations, and when the mind really wanders, during daydreams or at night before sleep sets in, it’s amazing what sort of involuntary memory leaps one makes, from impressions that often have no logical or logistical relationship but which share a texture or smell or emotional fragment. What’s more—and this may be the single most important point to understand about memory—every time a memory is recalled, new trails are made.
The act of remembering itself generates new memories. Which means that Emerson was exactly right when he noted in his journal: “Most remembering is only the memory of memories, & not a new & primary remembrance … HDT [Henry David Thoreau] noticed this to me some time ago.” Overlap, in other words, is not only built into the biology of memory. It is the very basis of memory—and identity. New memory traces are laid down on top of a foundation of old memories, and old memories can only be recalled in a context of recent experiences. Imagine a single painting being created over the course of a lifetime on one giant canvas. Every brush stroke coming into contact with many others can be seen only in the context of those prior strokes—and also instantly alters those older strokes. Because of this, no recorded experience can ever be fully distinct from anything else. Whether one likes it or not, the past is always informed by the present, and vice versa.
Scores of experiments confirm the malleability of old memories, and horror stories of False Memory Syndrome are by now widespread. The psychologist Elizabeth Loftus has spent the better part of her career documenting the ease with which false memories can be planted—accidentally or on purpose. Often, these false memories lead to wrongful convictions. In 1979, twenty-two-year-old Marine corporal Kevin Green was convicted of second-degree murder for the brutal beating of his wife and the death of their full-term fetus. His wife had testified after coming out of a coma that Green, her own husband, was the attacker. Sixteen years later, the real attacker, a total stranger, confessed to police to that murder and six others. It turned out that Green's guilt had been suggested to his wife early on in her rehabilitation. By the time the case came to trial, she had created a memory so clear that she was able to testify confidently against her husband.
“Eyewitness misidentification … is known as the single greatest cause of the conviction of the innocent,” says attorney Barry Scheck. He describes a typical scenario: “You can have as many as five witnesses who begin in kind of a soft way, saying, ‘That might be the guy,’ and then, like wet concrete hardening, the [memories] get fixed to the point that by the time they get to the courtroom, they’re saying ‘That’s the man.’”
Part of the deep attraction to the idea of distinct memory molecules was that it connoted the ability to replay old memories like videotapes on a VCR—just as they were originally recorded. But the biology of memory constellations dictates that there is no such thing as pure memory. Recall is never replay.
But why? Why would millions of years of evolution produce a machine so otherwise sophisticated but with an apparent built-in fuzziness, a tendency to regularly forget, repress, and distort information and experience?
The answer, it turns out, is that fuzziness is not a severe limitation but a highly advanced feature. As a matter of engineering, the brain does not have any physical limitation on the amount of information it can hold. It is designed specifically to forget most of the details it comes across, so that it may allow us to form general impressions, and from there useful judgments. Forgetting is not a failure at all, but an active metabolic process, a flushing out of data in the pursuit of knowledge and meaning.
We know this not just from brain chemistry and inference, but also because psychologists have stumbled upon a few individuals over the years who actually could not forget enough—and were debilitated by it.
In his New Yorker profile, Mark Singer wonders if Martin Scorsese is such a person—burdened by too good a memory.
Was it, I wondered, painful to remember so much? Scorsese’s powers of recall weren’t limited to summoning plot turns or notable scenes or acting performances; his gray matter bulged with camera angles, lighting strategies, scores, sound effects, ambient noises, editing rhythms, production credits, data about lenses and film stocks and exposure speeds and aspect ratios.… What about all the sludge? An inability to forget the forgettable—wasn’t that a burden, or was it just part of the price one paid to make great art?
For some perspective on the inability to forget, consider the case study that psychologists call S. In the 1920s, S. was a twenty-something newspaper reporter in Moscow who one day got into trouble with his editor for not taking notes at a staff meeting. In the midst of the reprimand, S. shocked his boss by matter-of-factly repeating everything that had been said in the meeting—word for word.
This was apparently no stretch at all for S., who, it emerged upon closer examination, remembered virtually every detail of sight and sound that he had come into contact with in his entire life. What’s more, he took this perfect memory entirely for granted. To him, it seemed perfectly normal that he forgot nothing.
The editor, amazed, sent S. to the distinguished Russian psychologist A. R. Luria for testing. Luria did test him that day, and on many other days over a period of decades. In all the testing, Luria could not find any real limit to S.’s capacity to recall details. S. could, for example, perfectly recall entire tables full of random data after looking at them for just a few minutes.
And not only could he efficiently recite these tables backwards, upside down, diagonally, etc., but after years of memorizing thousands of such tables he could easily reproduce any particular one of them, without warning, whether it was an hour after he had first seen it, or twenty years. The man, it seemed, quite literally remembered everything.
And yet he understood almost nothing. S. was plagued by an inability to make meaning out of what he saw. Unless one pointed the obvious pattern out to him, for example, a table of numbers ascending in simple 1–2–3–4 order appeared to him just as bereft of order and meaning as any other.
“If I had been given the letters of the alphabet arranged in a similar order,” he remarked after being questioned about the 1–2–3–4 table, “I wouldn’t have noticed their arrangement.” He was also unable to make sense out of poetry or prose, to understand much about the law, or even to remember people’s faces. “They’re so changeable,” he complained to Luria. “A person’s expression depends on his mood and on the circumstances under which you happen to meet him. People’s faces are constantly changing; it’s the different shades of expression that confuse me and make it so hard to remember faces.”
Luria also noted that S. came across as generally disorganized, dull-witted, and without much of a sense of purpose or direction in life. This astounding man, then, was not so much gifted with the ability to remember everything as he was cursed with the inability to forget detail and form more general impressions. He recorded only information, and was bereft of the essential ability to draw meaning out of events. “Many of us are anxious to find ways to improve our memories,” wrote Luria in a lengthy report on his unusual subject. “In S.’s case, however, precisely the reverse was true. The big question for him, and the most troublesome, was how he could learn to forget.”
What makes details hazy also enables us to prioritize information and to recognize and retain patterns. The brain eliminates trees in order to make sense of, and remember, the forests. Forgetting is a hidden virtue. Forgetting is what makes us so smart.
One of the worst things that I have to do is put on my pants in the morning. This morning I kept thinking there is something wrong because my pants just didn’t feel right. I had put them on wrong. I sometimes will have to put them on and take them off half a dozen times or more.… Setting the washing machine is getting to be a problem, too. Sometimes I’ll spend an hour trying to figure out how to set it.
—B.
San Diego, California
Chapter 4
THE RACE
Taos
“Ten years to a cure,” a Japanese scientist whispered to me in our hotel lobby as we waited for the shuttle bus to the Taos Civic Plaza.
The whisper was as telling as the words. He couldn’t contain his optimism, and yet he also couldn’t afford to put it on display.
Other Alzheimer’s researchers had lately been adopting a similar posture. As scientists, they were reserved by nature. But the recent acceleration of discovery had made them a little giddy. Hundreds of important discoveries had come in recent years, and funding for research was way up. The study of Alzheimer’s was now in the top scientific tier, alongside heart disease, cancer, and stroke research. This seemed fitting, since the disease was emerging as one of the largest causes of death in the U.S., not far behind those other three.
There was now even an Alzheimer’s drug on the market: Aricept, introduced in 1997, boosted the brain’s supply of the neurotransmitter acetylcholine. Some of the functional loss in early Alzheimer’s involves a deficiency of acetylcholine; replenishing it with the drug seemed to help about half of early- and middle-stage patients slow or even arrest the progression of symptoms for a year or more.
On the one hand, this was a giant advance: a real treatment that often made a tangible difference. But it was also a frustrating baby-step: Aricept did not slow the advance of the actual disease by a single day. It only worked on the symptoms. Scientists couldn’t stop Alzheimer’s yet—only put a thick curtain in front of it for a while.
More ambitious advances were brewing. An electronic update service named Alzheimer’s Weekly had been launched in 1998. Neurologists in the 1960s would have considered this phrase a sarcastic reference to the drudging nature of discovery: Understanding of the disease was practically frozen for more than half a century. But after a thaw in the 1970s and a renewed effort in the ’80s, genetic and molecular discoveries started to cascade so quickly by the mid-1990s that the excavation of Alzheimer’s seemed to be moving at the same clip as sporting events and financial markets.
Now a weekly update was not only useful but essential. In fact, updates on other Web sites came almost daily:
News from the Research Front
3 September 1998. H. J. Song et al. report that they are able to manipulate growth cones …
5 September 1998. Puny polymer pellets show promise as a vehicle for delivering nerve-growth factor to the basal forebrain …
6 September 1998. A novel brain-imaging agent promises to open up a window on the functioning of the brain’s dopamine system …
10 September 1998. Findings published in Nature Neuroscience indicate that the accumulation of calcium in the mitochondria triggers neuronal death …
10 September 1998. C. Y. Wang et al. report they have identified four genes that are targets of NF-kB activity …
11 September 1998. E. Nedivi et al. describe CPG15, a molecule that enhances dendritic arbor growth in projection neurons …
—from the Alzheimer Research Forum (at www.alzforum.org)
The research was so intensely specialized that few individual scientists appeared to even be working on the problem of Alzheimer’s disease per se. It was more like each was unearthing a single two-inch tile in a giant mosaic. By themselves, these individual experiments were so narrowly focused that they were far removed from a comprehensive understanding of the disease. But the minutiae had a purpose. If the great challenge of Alois Alzheimer had been to distinguish a general pathology of dementia from the normal cells of the brain, the task of contemporary scientists—employing exotic techniques with names like fluorescent protein tagging, immuno-lesioning, and western blot analysis—was to try to see what the process looked like in flux. Alzheimer glimpsed a mono-colored, silver-stained microscopic snapshot. Contemporary scientists, crunching and exchanging data with parallel processors and fiber optics, were trying to patch together more of a motion picture. Once they understood the actual disease process, particularly the early molecular events, they hoped they would be able to proceed toward genuine therapies.
The research had expanded in every direction, and had also gone global. Thousands of scientists from every continent now worked on the problem, as time became critical. In a little over a decade, the much-anticipated “senior boom” would begin, eventually quadrupling the number of Alzheimer’s cases and making it the fastest-growing disease in developed countries. In addition to the sheer misery, the social costs of such a slow, progressive disease would be staggering. In the U.S., the costs of doctor’s visits, lab tests, medicine, nursing, day care, and home care were already estimated at $114.4 billion annually. That was more than the combined budgets for the U.S. Departments of Commerce, Education, Energy, Justice, Labor, and Interior.
“We have to solve this problem, or it’s going to overwhelm us,” Zaven Khachaturian said. “The numbers are going to double every twenty years. Not only that: The duration of illness is going to get much longer. That’s the really devastating part. The folks who have the disease now are mostly people who came through the Depression. Some had college education, but most did not. The ones who are going to develop Alzheimer’s in the next century will be baby boomers who are primarily much better educated and better fed. The duration of their disability is going to be much longer than the current crop. That’s going to be a major factor.
“See, in considering the social impact of the disease, it’s not so much the pain and suffering that matters. From the point of view of the individual, that is of course the important factor. But from the point of view of society, what’s important is how long I am disabled and how much of a burden I am to society. With cancer and heart disease, the period where I cannot function independently is fairly short—three to five years. With Alzheimer’s, it’s going to be extremely long—like twenty years, where you are physically there, you don’t have any pain, you appear normal, and yet you have Alzheimer’s. You cannot function independently.”