“We do not grow older, we grow riper.” — Pablo Picasso
Historical and social forces shape how we think—and how well we think. The long history of documenting human intelligence contains curious patterns. Take the Flynn effect, the observed rise in standardized intelligence test scores over the twentieth century. The gains are peculiar because they reflect not our accumulated knowledge (crystallized intelligence), as we might expect, but rather a rise in our problem-solving ability and nimbleness of thinking (fluid intelligence). You can find the results of ten early studies here.
There are plenty of explanations for the Flynn effect, such as richer and more stimulating environments (think of the popularity of TV and film), better nutrition, and improved test-taking skills (many school systems now gear teaching toward tests). It's also very possible that the results are simply due to measurement artifacts. Yet the trend has held: on average, adults born later score higher than those born earlier.
At the same time, in some countries, including the Scandinavian nations, the UK, and Australia, a reverse Flynn effect has emerged: scores have leveled off or slipped back. Proposed explanations include a lower quality of education and increased time on screens, which can fragment our attention (my own research documents this). The bottom line is that no one really knows what causes the Flynn or reverse Flynn effects.
Meanwhile, another trend is emerging. Over the last 30 years, some studies show decreasing rates of cognitive decline and dementia. In the U.S., where nearly 20% of the population is over 65, this is a trend to watch. Some attribute the decline to greater awareness of exercise and healthy diet, and to improved treatment of hypertension and cardiovascular disease, all of which can affect dementia risk.
And now there's a surprising twist: some attribute this to technology use. A meta-analysis (a study that combines the results of multiple studies, in this case 54) suggests that digital technology use might be linked to better cognitive outcomes in middle-aged and older adults. The authors focused on “digital pioneers,” people who first encountered digital technologies as adults, as distinct from digital natives, who were born into it.
Six of the studies in the meta-analysis were longitudinal. Most studies in their sample controlled for socioeconomic factors, health, social support, educational level, occupation, and lifestyle—the things we expect might affect cognitive ability or impairment.
While this meta-analysis paints an optimistic picture, suggesting that people who use technology might slow their cognitive decline (and I'm sure tech companies are loving the result), we need to be cautious. I would love to embrace the results. However, before you view this as a free pass to jump onto your device and spend your days on the Internet preparing for your golden years, consider the following.
Correlation does not imply causation. There could be any number of underlying factors shared by middle-aged and older people who both retain their cognitive abilities and use technology.
First, the participants in these studies have been using technology for some time, in many cases long enough to count as early adopters. It is well known that people with certain personality characteristics are more likely to adopt technology, and even to be among the first to do so. These characteristics include risk-taking, curiosity, open-mindedness, and greater self-efficacy. In other words, it's very possible that the people in these studies who use technology also have personality traits that protect them from decline.
Second, it depends on what people actually do with technology. Technology is neutral—it's neither good nor bad, and consequently we can use it to harm or benefit ourselves. Consider the difference: reading Pride and Prejudice on an e-reader is very different from endlessly watching TikTok videos. Only three studies in the meta-analysis reported what people actually did when using technology “in the wild”; these looked at social media use and found mixed results. My suspicion—though I'm an optimist—is that those who benefit most from technology use it judiciously.
Third, perhaps the technology was used as scaffolding by people in the studies, as the authors acknowledge, and that is what improved their cognitive function. For example, if someone has trouble with memory, then reminders, notes, and timers on a smartphone can serve as an external memory bank. GPS can help people with weakened spatial memory. In this way, technology can act like a prosthetic, compensating for the slow erosion of certain abilities.
And finally, technology could promote better cognitive function in older adults by facilitating social connection. Studies of superagers show that what they have in common is strong social connections. A rich medium like video can help maintain social relationships; leaner media like texting, less so.
But cognitive reserve theory reminds us that reliable protection from cognitive decline comes from sustained mental challenge. Reading a novel, learning a language, playing chess—these activities demand more than recognition or reflex; they require deep engagement, and they build new neural pathways, engaging fresh neurons. And so it may be that it's not technology per se that enhances cognitive function; rather, any challenging mental activity can keep the mind robust.
Physical exercise matters too. Mounting research suggests that exercise can play a significant role in preventing cognitive decline. Here, technology such as wearables can promote healthy behaviors like walking or quality sleep. (I am an avid user of wearables.) Again, the key is not the technology itself, but how we choose to use it.
In the end, the numbers that map out our mental lives over time—the Flynn effect, the reverse Flynn effect, the decline in dementia rates—reflect the interplay of culture, education, lifestyle, and perhaps technology too, in ways we're still struggling to understand. Perhaps what the data really reveal is the mind's remarkable adaptability to the world it inhabits.
The problem lies not just in the devices themselves but in the temptation to let them think for us. We can instead consider how they might spur us to think differently—more socially, expansively, and imaginatively. In this sense, the future trend in our cognitive abilities will not be written by silicon alone, but by our enduring human ability to shape, and be shaped by, the forces of culture, education, and lifestyle.
**********************************
If you'd like to learn more about how you can improve your attention, you'll find it in my book Attention Span, which covers my research. It is now in paperback with new exercises.