It’s July 14, 2041. You wake with the gradual brightening of your bedroom lights, the shower already running at your preferred temperature. As you lather, you recount your dreams to your ButlerBot, the sophisticated AI that runs your smart home. It responds through the bathroom speakers with trenchant analysis consisting of Freudian factoids gleaned from Google. Once dressed, you use your brain implants to summon your commuter drone, and your ButlerBot hands you a packed lunch as you step into the drone’s mood-lit passenger pod.
Your ears pop as you are sucked skyward into a roiling, seething city-bound swarm of similar drones, each flight path controlled remotely by quantum computer. Noise-canceling speakers insulate you from the furious metallic buzz that surrounds you until the swarm spits you out and your drone glides down to your office basement. You step onto a conveyor belt that ribbons you up to your floor. Your firm’s ButlerBot greets you with a cappuccino, upon whose foam your initials are delicately traced.
Set aside for now the potential downsides of a twice-daily locust plague of flying robots (the hellish din, the blocking out of the sun, and so on), worries about where all this energy is coming from, and any qualms regarding the implants’ potential to permit remote access to your brain. Apart from all that, sounds great, right? No more sweaty commutes spent with your face pressed into some stranger’s fruity armpit. No more interminable waits in coffee shops clogged with strollers, from which howling children launch desperate escape bids, arching as though defibrillated.
And given how good your ButlerBot is looking lately in his new physical body (you inwardly congratulate yourself on choosing the Lumberjack kit—so manly); how funny, smart, and emotionally available he is; and the fact that he’s clearly fascinated with you, maybe it’s also a world without heartbreak.
Everything runs smoothly in the Frictionless Future. But would you want to live there?
From Mining to Meaning
If you use digital devices, AI is already being sicced on the grotesque bricolage that is your life to eliminate potential sources of “friction” — a piece of tech jargon that means roughly “whatever grinds your gears.”
Whether it’s by monitoring your calorie intake, presenting you with “optimal” romantic prospects, or making it easier to spend a fortune on Amazon, algorithms are even now insinuating themselves into your every existential crack and crevice like so many squirts of WD-40.
The possibility of using AI to eliminate diseases is undeniably exciting. Excising problems like cancer from society would make for a better future. But is maximizing efficiency the only way to add value to the world?
Most people don’t see the world and its inhabitants simply as a resource to be mined more or less effectively, nor do we tend to think that human value is exhausted by the efficiency or otherwise of this resource mining. Sometimes we just want to make sense of things: to look closely at the world, grasp some pattern in it, and articulate its significance, without some further goal in mind. This desire is what drives people to become scholars, but it’s also why people look at art, listen to music, or strive to build relationships with their grandchildren. If a concern for efficiency is a big part of what makes us human, our desire to grasp significance and share meaningful experiences with others is just as crucial.
Like a world without cancer, a more thoughtful, artistic, and compassionate future strikes us as an unequivocal Good Thing. But adding this kind of value to the world requires something more than maximizing efficiency. Anyone who tries to “hack” being a thoughtful scholar, or a good friend, is kind of missing the point. And becoming a better lover is hardly a matter of doing more, in less time, with less effort.
Meaning in a World of Clickbait
The oldest visible marks of our interest in communal sense-making, as opposed to efficiency-maximizing, date from around 39,000 BCE, when a group of our ancestors stenciled their hands on a cave wall in what is now Spain. Our desire to understand the world is deep and abiding. But satisfying that desire takes effort, and cheap thrills abound.
Right-here-right-now desires are the currency of today’s digital economy. Having figured out what sort of content you find irresistible, algorithms keep giving it to you, and the entities whose content it is pay the algorithm purveyors handsomely for directing you to their wares. The constant lure of “clicky” content is already making it harder to focus on the pursuit of meaning, whether that pursuit involves learning calculus, or writing poems, or simply paying attention to your dinner companion. And unless today’s tech moguls decide to overturn the business model that made them rich, it’s only going to get worse.
Of course, digital technology gives us unprecedented access to products of understanding — art, music, scholarship — which also serve as raw materials in our ongoing sense-making activities. But this accessibility comes at a price.
It’s more efficient to stream a live concert or Google a painting than to go see it for yourself: You save money, you avoid a commute, you can stay at work longer. Future AI is likely to create options like this in an ever-expanding range of contexts. The problem is that the accumulation of decisions to stay in the office or on the couch will inevitably erode the public spaces whose existence fueled the creation of that art and music in the first place.
Scholars and artists often grapple with the wider world in seclusion from it. But making sense of the human world requires apprehending it first, and public spaces are the prime arenas in which that happens. The sneer on one face, the delight on another; the wailing child ignored by an aloof sibling—the ways in which the contours of the human world radiate outward from our own familiar corners of it can make themselves felt in these spontaneous apprehensions, in the recognizable strangeness of the conduct of others. We are all forged and shaped in encounters like this — like pebbles in a bag, we polish one another, as Leonard Cohen once put it. Take away the bag, and the possibility of polishing vanishes with it.
There’s no principled reason why AI developers should rest content with the satisfaction of our most transient desires. Imagine if the immense ingenuity of Silicon Valley were instead channeled to support focused, sustained, and shared attention — algorithms that would nudge us toward the hard human work that we already want to do, instead of pulling us away from it. That sort of frictionless future would, I think, be something worth having.