TECHNOLOGY: THE MIND IN AN AGE OF FLUX
MAGGIE JACKSON ’78, a renowned journalist and social critic, wants people to understand how digital technology threatens the crucial work of focusing well and thinking deeply. In the introduction to the second edition of Distracted: Reclaiming Our Focus in a World of Lost Attention, published in 2018, Jackson writes: “The way we live our lives is eroding our capacity for deep, sustained, perceptive attention.” Last year, Distracted received the Media Ecology Association’s Dorothy Lee Award for Outstanding Writing on Technology and Culture. Jackson’s new book-in-progress explores the critical role that uncertainty plays in higher-order cognition. Milton Magazine recently spoke to Jackson about her hopes and concerns for the human mind in an age of speed, flux, and ever-more-powerful technologies.
IN DISTRACTED, YOU DESCRIBE HOW ADVANCES IN DIGITAL TECHNOLOGY ARE THREATENING OUR CAPACITY FOR DEPTH OF THOUGHT. WHAT CONCERNS YOU THE MOST?
The internet and AI revolutions are reshaping how we work, play, learn, relate, and think. Among these radical changes, what concerns me most is the fate of the last of these skills. Today I see an increasing unwillingness and even inability to think deeply. Our devices immerse us in milieus marked by speed, brevity, and info-bites, realms where ideas seem downloadable and wisdom seems attainable at a click, vending-style. Studies show that only a quarter of online posts are actually opened or read before they’re shared, liked, or retweeted. Another body of research shows that people are less willing to struggle with a problem after just a brief bout of online searching. In essence, technology largely speaks to one side of our cognitive selves: the heuristic, shortcut gut thinking, such as stereotyping or learned-pattern recognition, that occurs quickly and often unconsciously. What gets sidelined today are slower, messier modes of critical thinking and creativity, the types of mental operations that don’t happen with a click. How do we discern what’s relevant amidst conflicting evidence? How do we find the insights hidden within a muddy problem? These are some of the aspects of mind that are under siege today.
DO YOU THINK MOST OF US ARE EVEN AWARE OF WHAT’S BEEN LOST IN OUR ENTHUSIASM FOR WHAT TECHNOLOGY OFFERS?
The national conversation around attention and technology has slowly matured beyond the overly simplistic “Luddite” vs. “tech-booster” divide—and that’s truly heartening. Now I hear people of all ages questioning technology’s effects on their lives. A couple of years ago, I spoke to a group of university students who were very concerned about their young nieces’ and nephews’ screen time. These were students who had grown up with technology themselves and yet they shared older generations’ ambivalence about our deepening dependence on the virtual, the digital, and the algorithmic. It’s important that we continue to nurture a nuanced and judicious sense of tech-skepticism so that these inventions aren’t accepted mindlessly into our society and our lives.
IN SO MANY WAYS TECHNOLOGY HAS BEEN A GODSEND DURING THE PAST YEAR, AS WE’VE BEEN FORCED TO LIVE APART FROM ONE ANOTHER. AS WE’VE BECOME MORE ADEPT AT LIVING VIRTUALLY, WHAT DOES THIS MEAN FOR THE CONCERNS YOU RAISE?
The pandemic is truly a time of “invention springing from necessity.” We are living much of our lives on screens, often in new and creative ways. My neighbor had a holiday cookie baking party on Zoom with relatives scattered across the country. Yet the increased and even innovative use of technology doesn’t necessarily translate into fully understanding what we are gaining and losing in this new age. The scholar Walter Ong once wrote, “To know something as fully as possible we need to be close to it and … at the same time distanced from it.” That’s so true of any technology—be it a book, a smartphone, or a robot pet. Any technology can be a vehicle (for gaining information or long-distance togetherness) or an impediment (to being present for others or focused problem-solving) as well as a hidden influence. Consider how Google answers your search question before you’ve fully asked it. To understand our devices on all these levels, it’s crucial to master their use while remaining a “tourist” in the technological world, keeping a “beginner’s mind” about this realm. That way we can turn our nascent concerns about technology into true skill in using our devices wisely. We can learn to continually ask ourselves, “Is this device bringing us together or getting in between us? Is this app enabling me to deepen my knowledge or keeping me sated with surface answers?”
HOW CAN WE TAKE ADVANTAGE OF ALL THE WONDERS THAT TECHNOLOGY HAS TO OFFER WHILE AT THE SAME TIME LIVING THOUGHTFUL AND MEANINGFUL LIVES?
As I mentioned, we need to become tech skeptics who continually take stock of digital life from multiple perspectives in order to use these extraordinary inventions more wisely. I’d suggest two further ways to pursue this crucial mission. First, use boundaries of time, space, and the mind. Stepping back completely from technology at times allows us to curate our environment, taming the fractiousness of the digital world and restoring what I call the integrity of the moment. Some CEOs and other leaders, for example, have experimented with limiting the use of devices in meetings. Even online, we can use boundary-making in order to foster depth of thought. In other words, open just one window, not six, while you’re working. Or turn off the constant interruption of notifications for a time. Studies show that people who multitask a lot have poorer memories and are more distractible than those who tend to focus on one or two things at a time throughout the day. When we draw clear lines between “this” and “that” or “then” and “now,” we are setting the stage for the attentional skill of focus, the most important boundary-making of all.
Another way in which we can take advantage of technology while cultivating skills of reflection and attention is by dropping the outdated metaphor of the mind as a computer. Our language is suffused with the idea that we are “programmed” to do this or “hardwired” to do that. For example, memory is seen as a kind of system of file folders that you click in and out of rather than the organic, evolving entity that it is. The mind in fact creates knowledge slowly by abstracting, synthesizing, and sifting information. If you put down your device and struggle to remember that restaurant where you ate last summer, or the forgotten name of a painter, you’re actually strengthening these astonishing processes of knowledge-building. Outsourcing memory is just one way that we often treat the human as a kind of deficient machine, and so lose chances to cultivate our species’ strengths of discernment, adaptability, empathy, and rationality. For all our frailties and errors, I’m not willing to give up on the human mind.
IN THE BOOK YOU ARE CURRENTLY WRITING, YOU EXAMINE THE ROLE UNCERTAINTY PLAYS IN DEEPER THINKING. CAN YOU SAY A LITTLE ABOUT THAT?
Everyone knows what it feels like to be unsure and how unsettling that state of mind is. This past year, we have constantly lamented “these uncertain times.” But until recently, little scientific attention was paid to the mechanisms and importance of psychological or epistemic uncertainty, which is defined as the awareness of the limits of your knowledge. It was treated as a kind of cognitive no-man’s-land, little understood by researchers and laypeople alike. In this regard, uncertainty is like attention, which was a scientific mystery until a few decades ago.
Now, however, there is an explosion of new discoveries showing uncertainty’s role in reflection, flexibility, creativity, and even well-being. Uncertainty is akin to a mental gadfly, pushing us to think more deeply, to investigate, and to keep an open mind. “Not knowing” allows us to drop our assumptions about people whose politics we loathe and equips us to realize when we need to study harder or take a second look. It offers us the mental space to engage in creative reverie or to test a rough hypothesis.
In my new book, I cite a revealing study about European CEOs who faced a huge European Union expansion, sort of the opposite of Brexit. When researchers surveyed the CEOs in advance of the change, in 2004, some predicted that the larger marketplace would be great for their firms. Others said the new competitive climate would threaten their businesses. To the researchers’ surprise, however, a third group—contrary to the ideal of what a CEO should be—admitted to being unsure about what the expansion might bring. And lo and behold, it was this third group that a year later proved to be most adaptive during the crisis. They considered a wider range of options, included diverse voices in their decision-making, and showed more resourcefulness in their responses. Their ambivalence inspired actions better calibrated to the situation. In contrast, those who were most sure of the way forward stuck to tried-and-true measures or sometimes did nothing at all. I see uncertainty, along with attention and reflection, as one of the foremost pillars of human wisdom, mutual understanding, and creativity. And yet too often, in realms from medicine to business, activism to education, and even just around the dinner table, being unsure is seen as weakness, not wisdom—as something to run from or eradicate at a snap. We can’t move out of this dark time unless we push back on the dominant ideal in our culture that knowledge is quick-fix, push-button, and certain. And I think we are on the cusp of doing just that.
INTERVIEW BY SARAH ABRAMS
Photograph by Jason Grow