
This Is Your Faulty Brain, On a Microchip

Take a good look at the downward trend of this graph—it’s important. It’s the reason why you’re only getting worse at first-person shooters and why you never feel as sharp as you were yesterday. It’s the human condition.

Your Mind Is Declining

Behavioral evidence suggests that general cognitive decline begins in your 20s (not in old age) and proceeds as a steady, roughly linear slide. (Yes, it’s depressing. No, the claim isn’t based on some quack study.)

The decline shows up most clearly in mentally demanding abilities: processing speed (how quickly you handle incoming information), attention, working memory (how well you keep information active in your mind and manipulate it), and, of course, long-term memory.

In real life, these effects are seen in everything from how long it takes to learn a new skill to how quickly you can recall a factoid. They’re with us all day, every day.

Humans, of course, are adaptive creatures, and the human mind is the most incredible biological machine in existence. Not all hope is lost. We already develop coping mechanisms, and accumulated experience often blunts the impact of our declining cognition. (Experience is represented by “world knowledge” on the graph above.) But in an era when anything seems possible, could we significantly alter the course of this graph?

Could we make the three green, three blue and four grey lines stay level…or even go up?

What You Can Do About It Now

There’s a simple mantra among cognitive psychologists: “Use it or lose it.”

Before we delve into the research, a thought experiment for the old fogies in the audience: do you remember a time when you knew the phone number of every person in your life? It was probably around 1995. You were a human telephone book, dialing mere acquaintances from memory as easily as loved ones, without a Rolodex in sight.

Now that your cellphone is your main means of communication, how many numbers do you remember? How many close friends are in your address book instead of your mind?

That’s use it or lose it. Or rather, it would be, if we ever actually tried to memorize a phone number again and found that we couldn’t.

Our long-term memory, the way the brain stores its files when they’re not actively in use, is widely considered to have limitless capacity. But if we don’t push our minds, they atrophy, much like an unused muscle.

Countless studies link engaging lifestyle habits, like holding a challenging job, keeping up hobbies, solving problems, socializing, and learning new skills, to cognitive health. Such habits are even associated with delaying, and helping predict, the onset of Alzheimer’s.

So just as lifestyle can effect positive change in cognition (and potentially alter the course of that graph on its own), we must recognize that offloading mental processes, like letting a computer remember things for us, carries risks of its own.

But if our quest is to flatten that graph of cognitive decline entirely, we may well need to bring in the sci-fi. What if we invented, say, a neurally connected hard drive to give us some backup?

Theoretically, it would work better for some types of memories than for others.

The Problems With Artificially Encoding Memories

Sensory memories are pretty easy to wrap your head around. Input from the five senses (sight, smell, taste, touch, and hearing) can be saved in your brain as memories. For instance, you can probably remember the taste of a McDonald’s cheeseburger without having it on your tongue.

But can we digitize these pieces of our life? Some senses are obviously easier than others. We already understand how to capture sight and sound in incredible detail. But how do you turn smoky BBQ flavor into ones and zeroes, or the feel of fine leather?

There’s no codec for smell…not yet.
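To make the contrast concrete, here’s a minimal Python sketch. The audio half is real: sound reduces neatly to sampled, quantized numbers (PCM, the basis of WAV and CD audio). The smell half is pure assumption: the odorant names and weights are hypothetical stand-ins, because there is no standard set of “primary” smells to sample against.

```python
import math

# Sound digitizes cleanly: sample the waveform at a fixed rate and
# quantize each sample to an integer (16-bit PCM, as on a CD).
def sample_tone(freq_hz: float, seconds: float, rate: int = 44100) -> list[int]:
    """Digitize a pure tone as 16-bit PCM samples."""
    n = int(seconds * rate)
    return [round(32767 * math.sin(2 * math.pi * freq_hz * t / rate))
            for t in range(n)]

# Smell has no equivalent standard. A *hypothetical* encoding (an
# assumption for illustration only) might be a vector of odorant
# concentrations, but with no agreed-upon odor basis to sample
# against, which is exactly why no codec exists.
smoky_bbq = {
    "guaiacol": 0.8,   # phenolic, smoky note
    "syringol": 0.6,   # wood-smoke note
    "furfural": 0.3,   # caramelized, roasted note
}

print(len(sample_tone(440.0, 0.01)), "samples capture 10 ms of a 440 Hz tone")
print("hypothetical smell vector:", smoky_bbq)
```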

From there, it’s not a great mental leap to the challenge of recording more abstract thought: feelings, like how nervous you were on your first date. Science can’t really quantify them beyond measures like adrenal output, and describing them has caused artists and Hallmark card writers alike millennia of agony.

If you can suspend your disbelief about the practical challenges of encoding memory digitally, however, you arrive at one final type of memory I’d like to highlight in this skim through the human mind, one that carries a great deal of risk: false memories.

Elizabeth F. Loftus is one of the world’s foremost experts on the accuracy of memory. And she’s hated by many. Her research, while highly respected in academic circles, has gotten her harassed, threatened, and sued.

Why? She’s the type of researcher who’s brought in as an expert witness in sexual abuse trials to argue against the plaintiff, because her research has found that what some label repressed memories can really just be false memories.

But her research has shown, time and time again, that false memories are not at all hard to plant. Heck, you can even do these experiments on yourself. As soon as you start to imagine something you did as a child, it starts to feel a heck of a lot like a memory—at least that’s how things work in my head. (The research backs me up here, too.)

In one such study, participants were told by their parents that, back in second grade, they’d gotten in trouble for pouring slime on a teacher’s desk (a fake photo served as further evidence); a whopping 65% of participants “remembered” the story two weeks later. In another study, participants were shown Disneyland advertisements featuring Bugs Bunny. Afterward, 16% of them recalled personally meeting Bugs at Disneyland growing up, an obvious impossibility, since Bugs is a Warner Bros. character who would never appear at a Disney park.

And in each of these cases, participants weren’t merely remembering something with vague familiarity. They were listing specific details and, sometimes, concocting whole narratives describing the day.

So how does false memory relate to digital memory?

Well, it’s easy to imagine a digital memory system so well cataloged that false memories, or even nostalgic, rose-tinted ones, couldn’t con their way into the database. But given our natural proclivity for collecting false memories, it seems equally plausible that we’d simply generate them with newfound digital speed and permanence.

An even scarier possibility: someone else adding false memories to your brain without your knowledge.
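How might a well-cataloged memory system resist that kind of tampering? One familiar idea from computer security, sketched below in Python under assumed design choices (the record format, field names, and key handling are all hypothetical), is to make every stored memory tamper-evident: sign each record with a secret key at write time, so anything added or altered afterward fails verification.

```python
import hashlib
import hmac
import json

# Hypothetical: the key would live in the implant's secure hardware,
# never exposed to whoever (or whatever) writes to the database.
SECRET_KEY = b"implant-secure-element-key"

def sign_memory(record: dict) -> str:
    """Compute an HMAC-SHA256 tag over a canonical form of the record."""
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def verify_memory(record: dict, tag: str) -> bool:
    """True only if the record is bit-for-bit what was originally signed."""
    return hmac.compare_digest(sign_memory(record), tag)

# A genuine memory, signed as it's recorded.
memory = {"when": "1988-06-12", "what": "met Mickey Mouse at Disneyland"}
tag = sign_memory(memory)

# An attacker later swaps in an impossible detail...
memory["what"] = "met Bugs Bunny at Disneyland"

# ...and verification catches the edit.
print(verify_memory(memory, tag))  # False
```

Note the limit, though: a signature only proves a record hasn’t changed since it was written. It can’t tell you whether the memory was false the moment it went in, which is exactly the failure mode Loftus’s work exposes.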

As Loftus herself has said, “Even false memories have a lot of sensory details in them.” If false memories were combined with doctored photos and a few faked audio clips, where would our perception of the past go? What would memory be in an era of mental Photoshop?

A Bad Time to Be an Early Adopter

At Gizmodo, we’re a like-minded group that’s fairly confident in the potential of technology. We’re futurists. Digital heathens, sure, but optimists, too.

Given evidence that the human brain starts declining in our 20s, most of us would probably reach for a bit of bionic storage or extra processing power if offered the chance. Just keep in mind, research shows that the best things you can do for your own cognition now (and into the near future) are to challenge your mind and exercise your body.

And as for those brain drives…they, too, might contain some Bugs.

[Lead image adapted from Park, D.C., Lautenschlager, G., Hedden, T., Davidson, N.S., Smith, A.D., & Smith, P.K. (2002). Models of visuospatial and verbal memory across the adult life span. Psychology and Aging, 17, 299–320.]

Memory [Forever] is our week-long consideration of what it really means when our memories, encoded in bits, flow in a million directions, and might truly live forever.
