A brain full of wisdom

Rethinking the aging brain


We all expect to lose a step or two with age, both physically and mentally. Our memory slips and we slow down; sometimes our keys end up in the freezer and the name of that movie with the skydiving surfers hangs recalcitrantly on the tip of the tongue, and we blame our senesced brains. That cognitive unwinding might even start by the time we’re 45—and we’re ominously warned that with increasing lifespans, dealing with the deteriorating skills of our elders promises to be the growth industry of the 21st century (that and medical waste disposal).

But what exactly happens when we age? Two fundamental measurements used by psychologists are accuracy (e.g., how well someone can recall a list of words) and reaction time (how quickly they respond when performing some task). Both are negatively affected by age. Older people tend to be slower than young people on most tasks that measure processing speed, and tend to be less accurate than young people on most tests of memory or attention. So the “cognitive decline” of age is this: older adults are slower and less accurate than younger adults on most cognitive tasks.

That all seems to support the notion that age simply gets the better of us. Physically, our bones grow brittle and our joints stiff. Mentally, our brains become less efficient and weaker, losing neurons and myelin along the way—and we end up doing worse on memory tasks and forgetting the name of Point Break.

But a recent paper challenges that seemingly obvious conclusion: we might not be declining at all, its authors argue. At least not the way we think. True, cognition seems slower and more error-prone in the elderly—but that’s because old people have accumulated a lifetime of knowledge, forcing them to sift through more extraneous material as they work to recall memories or search for the information they need. Slowdowns and inaccuracy may not be the ravages of age, but the inevitable consequences of a brain brimming with information from decades of learning.


Take, for example, the lexical decision task. In that task, participants see a string of letters and, as quickly as possible, must determine whether that string is a word. If they see “WHERE” they say ‘yes’, and if they see “WHODE” they say ‘no’. In general, old people are slower than young people at the lexical decision task. That slowdown is usually seen as a classic example of the kind of cognitive decline caused by age.

But this paper offers a different view. Vocabulary grows throughout our lives: a 70-year-old might know 15,000 words more than they did at twenty. When we do a lexical decision task, the brain isn’t taking the test word and comparing it one-by-one to each word in your vocabulary—it’s not that simple—but with all those extra words comes more interfering information. More words in your vocabulary are spelled like the test word, sound like the test word, or are semantically related to the test word, which makes it more challenging, more uncertain, and more error-prone when you try to quickly determine whether “WHELP” is a word (of course, keep in mind that on a brain time-scale, “more challenging” means “takes 20 milliseconds longer”). Simply put: a big vocabulary should make lexical decisions take longer, and old people generally have big vocabularies.
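To make that interference intuition concrete, here’s a minimal sketch in Python (my own toy illustration, not the paper’s model): count how many stored words look like the test string, and pretend each lookalike adds a sliver of checking time. The word lists and the millisecond figures are invented for illustration.

    def lookalikes(test, vocabulary):
        # Count stored words that differ from the test string by exactly one
        # letter -- a crude stand-in for "words spelled like the test word".
        same_length = [w for w in vocabulary if len(w) == len(test)]
        return sum(1 for w in same_length
                   if sum(a != b for a, b in zip(w, test)) == 1)

    # Toy vocabularies: the "older" one is a superset of the "younger" one.
    young_vocab = {"WHERE", "WHALE", "WHILE", "HELP", "YELP"}
    old_vocab = young_vocab | {"WHELK", "WHELM", "KELP", "WHEY", "SHELF"}

    for label, vocab in [("age 20", young_vocab), ("age 70", old_vocab)]:
        n = lookalikes("WHELP", vocab)
        # Pretend each competing lookalike adds ~20 ms to a 500 ms baseline.
        print(f"{label}: {n} lookalike words -> roughly {500 + 20 * n} ms")

The particular numbers mean nothing; the point is that the only difference between the two runs is how many words are stored.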

To test that hypothesis, the authors developed a computational model (a neural network) meant to mimic a real person. The network is “trained” with words to develop a vocabulary, and, like a real person, can be given a lexical decision task—determining whether a string of letters matches a word found in its “vocabulary”. The test is straightforward: they train two networks. The first gets 20 years’ worth of words (based on how many words the average person sees/reads in a day), mimicking the vocabulary of a twenty-year-old. The second model is trained with 70 years’ worth of words. Then, each model is given a lexical decision task, and we look to see how its behavior compares to human performance.
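The paper’s networks are trained on realistic amounts of word exposure; I don’t know their exact architecture, so here is only a back-of-the-envelope sketch of the training step in Python: sample 20 versus 70 years of word tokens from a made-up, Zipf-shaped language and count how many distinct words each simulated reader ends up knowing. The size of the language, the tokens per year, and the Zipf exponent are all assumptions for illustration, not figures from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # A toy "language": 100,000 possible word types with Zipf-like frequencies.
    n_types = 100_000
    weights = np.arange(1, n_types + 1, dtype=float) ** -1.5
    probs = weights / weights.sum()

    tokens_per_year = 100_000  # invented stand-in for words seen/read per year

    def train_vocabulary(years):
        # Expose a simulated reader to `years` worth of word tokens and
        # return the distinct word types it has encountered.
        tokens = rng.choice(n_types, size=years * tokens_per_year, p=probs)
        return np.unique(tokens)

    young = train_vocabulary(20)   # stand-in for the "age 20" model
    old = train_vocabulary(70)     # stand-in for the "age 70" model

    print(f"age-20 vocabulary: {young.size:,} word types")
    print(f"age-70 vocabulary: {old.size:,} word types")

    # The words only the older reader knows are overwhelmingly rare ones.
    extra = np.setdiff1d(old, young)
    print(f"known only at 70: {extra.size:,} types "
          f"(median frequency rank {int(np.median(extra)):,})")

Run as-is, the older reader knows substantially more word types, and nearly all of the extra words are low-frequency ones, which lines up with the footnote below: old and young differ mostly on rare words.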

In real life, we know that “olds” are slower than “youngs” at this task. And the neural network models show exactly that same pattern. The “age 70” model, with a 30,000+ word vocabulary, took longer to do lexical decisions than the “age 20” model, with a vocabulary of only around 20,000 words.* Because the two models differ only in how much they were trained, their difference in performance owes entirely to vocabulary size. Thus, what we think are age-related cognitive effects are instead knowledge-related: in lexical decision tasks, slowdowns can be caused by bigger vocabularies, not the physical deterioration of age. In fact, if we could accurately measure the size of a person’s vocabulary, we might find that people with large vocabularies are slower at lexical decision tasks, regardless of age.

*That’s a bit of a generalization. Older adults are actually only slower than younger adults when tested on rare and uncommon words. They do not differ when tested on common words. The models also show this pattern.

That result alone would be interesting, but the authors go on to describe similar evidence from entirely different tasks. One task is perceptual matching—determining whether a letter pair matches (Aa) or does not (AB). Older adults are slower at perceptual matching than younger adults, and this holds true for the trained models as well. Another example is memory for word pairs, like court-judge or book-cries. Older adults tend to have poorer memory for word pairs that are unrelated (like book-cries) than do younger adults, and this pattern is also demonstrated when comparing “old” models to “young” models. That effect, too, is caused by existing knowledge: older individuals have spent a lifetime learning that book and cries aren’t related, so to learn the new pairing they have to undo or ignore all those years of prior knowledge.
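The paper demonstrates the word-pair effect with its trained models; as a rough stand-in for the intuition, here’s a toy Python sketch using a simple delta-rule update and entirely invented association strengths: a learner whose cue word already has strong associates needs more study trials before the new, arbitrary associate wins out at recall.

    def trials_to_learn(prior_associates, lr=0.2):
        # Count study trials of the pair book-cries until "cries" becomes
        # book's strongest associate, given pre-existing associates of "book".
        strengths = dict(prior_associates)
        strengths["cries"] = 0.0
        trials = 0
        while strengths["cries"] <= max(v for k, v in strengths.items()
                                        if k != "cries"):
            # Simple delta-rule update toward a maximum strength of 1.0.
            strengths["cries"] += lr * (1.0 - strengths["cries"])
            trials += 1
        return trials

    # Invented strengths: the older learner has more, and stronger,
    # prior associations for "book" than the younger learner does.
    young_prior = {"page": 0.30, "read": 0.25}
    old_prior = {"page": 0.80, "read": 0.75, "library": 0.70, "shelf": 0.65}

    print("young learner:", trials_to_learn(young_prior), "study trials")
    print("old learner:  ", trials_to_learn(old_prior), "study trials")

Nothing in this toy learner is broken or slow; the older one simply has more prior knowledge for the new pairing to compete against.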

A final test addresses the biggest age-related cognitive bugaboo: remembering names. This one involves a nifty observation: the number of ‘possible’ first names has been steadily increasing for the last century or more. Thus, as a person gets older, we expect that they are developing a larger and larger “vocabulary” of names. And just like with lexical decisions, we should expect a person to get slower or less accurate at remembering names over time, as their name vocabulary expands. And when the authors train a name-learning model, it shows this pattern. I like this particular example because it suggests an interesting prediction: in countries or cultures with smaller sets of possible names, is name-forgetfulness a less common complaint of the aging?
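The authors test this with a trained name-learning model; as a rough sketch of why a bigger name “vocabulary” hurts, here’s a toy calculation using a Luce-style choice rule, in which every name you know competes a little for retrieval. The retrieval strengths and the counts of names known are invented numbers.

    def recall_probability(names_known, target_strength=200.0,
                           competitor_strength=0.02):
        # Luce-style choice: the sought name competes with every other name
        # in memory, each of which contributes a small pull of its own.
        competition = competitor_strength * (names_known - 1)
        return target_strength / (target_strength + competition)

    # Invented name inventories for a 20-year-old and a 70-year-old.
    for label, n_names in [("age 20", 1_000), ("age 70", 5_000)]:
        p = recall_probability(n_names)
        print(f"{label}: ~{n_names:,} names known -> "
              f"retrieves the right one {p:.1%} of the time")

Shrink the pool of possible names, as the cross-cultural question above imagines, and the gap between the two shrinks with it.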

Taking a moment to summarize these findings: If we build computational models meant to mimic how the brain processes information, models that have more experience (they’ve “lived longer”) show the same response patterns as older adults. They’re slower to recognize words or names, slower to match letters, less accurate at remembering word pairs. But, crucially, what distinguishes these models is not age, but only the amount of information they have stored. And because of that, we are led to a conclusion: age-related cognitive decline may not owe to the insults of age per se—but to a brain laden with information.


That conclusion falls out from one simple insight: our cognitive abilities are affected by what we already know. And maybe a sub-insight: old people have accumulated a lot of knowledge simply by being, well, old. It’s a novel paper and a novel insight, even though many of the underlying concepts—for example, that having already learned a lot can make learning new information more difficult—are some of the oldest concepts in psychology and learning. And yet somehow, until now, their relationship to aging simply slipped through the cracks.

Many innovative ideas seem simple and obvious once suggested; this is no exception. Of course existing knowledge influences how we respond to new information; decades of research demonstrate that. Shouldn’t we have expected old people to have learned more than young people? Perhaps we so expected age to impair cognition that we didn’t stop to consider the whys and hows. And it’s only now that we realize that maybe knowledge, not time, is what changes as we age.

I had a conversation recently about people with “perfect” autobiographical memories—the people that can tell you the weather and what they ate for dinner on any random date from their past. In contrast, the rest of us forget that information almost immediately. There’s this tendency, I think, to immediately wonder why our brains didn’t evolve to have perfect memory like those savants. Why bother forgetting things if we don’t have to?

But maybe our “forgetfulness” isn’t as much of a flaw as we think. What this study demonstrates—in several different ways—is that there’s always a cost to learning and remembering. At the simplest level, there’s the time it takes to learn something new. But dig deeper and consider how storing oodles of information can make it harder to retrieve the one thing you need, because there are so many extraneous things to distract from and obfuscate the information you’re after, like sifting through a bin of Legos looking for just the right piece. Memories like “I had pizza bagels for dinner last night” are usually ephemeral, but that isn’t necessarily a bad thing: forgetting about those selfsame pizza bagels might keep our mental rolodex uncluttered and ensure that more important memories are easily accessible. Your ability to remember an alpaca-shearing Peruvian adventure might benefit from forgetting your Hungry Man dinner from last week.

The idea that “forgetting” might be a feature and not a flaw in how the brain manages memories also aligns with the types of cognitive deficits we see in the aged. One age-related cognitive deficit is in contextual memory: older people tend to remember fewer contextual details of events than young people. Like slowed response times, that’s viewed as evidence of advancing neuro-decrepitude. But maybe it’s not a failing. Maybe it’s a brain that knows those contextual details are often just embroidery, and has learned to simply crop out those minutiae—which isn’t usually a problem, until a researcher tests you on those details.

Brain scientists sometimes treat deficits or improvements in abilities as occurring in a vacuum, but that’s rarely true. Often, we get worse in one area because we’ve improved in another. That kind of tradeoff often happens in cognitive skills, even if we don’t recognize it. People remember fewer peripheral details of emotional pictures than of non-emotional ones. In one sense, that’s a deficit. But if we dig deeper, we see that it’s really a tradeoff: with emotional pictures, we accept poorer memory for the peripheral details in exchange for better memory for the central ones. Older adults, apparently, trade a little speed on lexical decision tasks for a bigger vocabulary. Not all skills are seesaws like that, but the brain can’t endlessly acquire knowledge and skills without modifying some others. Knowledge has a cost.

A few months back I read Neil Postman’s Amusing Ourselves to Death, in which he argues that television is narcotizing us into an intellectual stupor. Given that the sum total of human knowledge is increasing at an accelerating rate, and expertise is simultaneously becoming narrower and more difficult to achieve, my view is that we’re more likely knowing ourselves to death.

We’re bombarded by facts and knowledge and scientific evidence and longform journalism and mega-long Ken Burns documentaries. And the brain juggles many balls: maintaining information, storing new information, forgetting useless information, and ensuring that needed information can be retrieved. I don’t think those are explicit “goals” for the brain, but they represent a fundamental balancing act of memory: a tension between how much can be learned, how much can be stored over the long term, and what we can access when we need it. Maybe the so-called “decline” of aging is a warning sign about the effects of a world in which information is so plentiful and knowledge so abundant.

We should reconsider the idea of age-related cognitive decline. We somehow missed, all this time, that the passage of time isn’t the only thing that’s different between a twenty-year-old’s brain and a seventy-year-old’s brain: there’s also the accumulation of new memories, skills, and knowledge. Being slower at cognitive tasks or forgetting more names may be the price of that knowledge: in a very real sense, we can’t help but get wiser as we get older (and lifelong learning isn’t just a phrase used to sell college courses to retired folks). The very idea of treating the effects of aging as “decline” that can be “fixed” is misleading—the only fix is to stop learning. Getting older may not mean a slowing brain, but one running at the same speed through a bigger library.