The Real Concern About AI: Echo Chambers and Complacency
Due to science fiction, current terminology, misunderstandings of what generative AI is, and resistance to disruptive technologies, there is a great deal of fear and punitive sentiment surrounding AI. At the most extreme, some view ChatGPT as Skynet’s great-grandfather, leading to a Terminator-like dystopian future in which AI wipes out humanity. These fears have even prompted Congress to hold hearings on managing AI.
On the other hand, industries like publishing are vehemently opposed to AI, imposing punitive measures such as rejection or outright bans if AI is used at any step of the process. They fail to realize that even a simple Google search to locate information for a story relies on AI, since Google’s search algorithms are AI-driven.
AI is also a disruptive technology. Just as video killed the radio star, Netflix replaced Blockbuster, and the buggy driver was replaced by the automobile chauffeur, AI will bring changes to the labor and business markets, and those changes can be intimidating to people who resist change.
I tend to view most of these opinions as hyperbolic at best. They are typically grounded in misconceptions about what AI is and what it can do rather than in reality. That said, there is one concern I have about AI. It’s a concern I recognize in my own daily life, even as I opt in to various data collection services that enhance the suggestions and outputs of AI-powered programs. Despite that, it is worth discussing.
Here’s my concern:
We already live within echo chambers. Most people consume content shaped by their religious and political worldview, as well as by their interests in specific sports, music genres, or hobbies, and that has a significant impact on the type of content they encounter. If you listen to Flatt & Scruggs on Spotify, the algorithm is highly unlikely to suggest “Volare” as played by the Gipsy Kings. If you’re into crochet, YouTube’s algorithm is unlikely to suggest leatherworking videos. My point is that all of us, whether intentionally or subconsciously, already live in echo chambers.
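To make that concrete, here is a toy sketch, with invented genre tags and listening counts, of how a similarity threshold keeps a bluegrass listener from ever seeing a Gipsy Kings track. This is not Spotify’s actual algorithm, just an illustration of the filtering principle.

```python
# A toy sketch (not any real service's algorithm) of why genre-based
# similarity filtering keeps a bluegrass listener inside bluegrass.

from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two genre-tag count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    norm = (sum(v * v for v in a.values()) ** 0.5) * (sum(v * v for v in b.values()) ** 0.5)
    return dot / norm if norm else 0.0

# Listening history as genre-tag counts (hypothetical data).
listener = Counter({"bluegrass": 40, "folk": 12, "country": 8})

catalog = {
    "Foggy Mountain Breakdown": Counter({"bluegrass": 1, "folk": 1}),
    "Volare":                   Counter({"latin": 1, "pop": 1, "flamenco": 1}),
    "Man of Constant Sorrow":   Counter({"bluegrass": 1, "country": 1}),
}

# Recommend only tracks above a similarity threshold -- everything else
# silently disappears from the listener's view.
THRESHOLD = 0.3
for track, tags in catalog.items():
    score = cosine(listener, tags)
    verdict = "suggest" if score >= THRESHOLD else "never shown"
    print(f"{track:28s} similarity={score:.2f} -> {verdict}")
```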
Building on those echo chambers, consider the data aggregation we’ve been participating in for nearly three decades. The grocery loyalty cards we all signed up for back in the ’90s have been used by companies like Kroger’s 84.51 to collect data on our purchasing habits. In many ways this is mutually beneficial: Kroger’s algorithms can recognize what items its shoppers like to buy and offer marketing incentives via digital coupons in its app, which can expedite the shopper’s next trip to the store while generating additional sales for the company. The data 84.51 has collected over those decades has allowed it to aggregate consumer behavior and optimize everything from the number of employees on the floor, to the amount of cash needed in the registers on any given day, to what and how much of each product should be on the shelves.
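As a rough illustration of that kind of aggregation, here is a deliberately simplified sketch that rolls invented loyalty-card transactions up by hour and applies a made-up staffing and cash-float rule. The field names, numbers, and planning rule are all hypothetical; 84.51’s real models are far more sophisticated.

```python
# A simplified sketch of loyalty-card aggregation. All data and rules here
# are invented for illustration only.

from collections import defaultdict
from datetime import datetime

# Each record: (loyalty_id, timestamp, basket_total) -- sample data.
transactions = [
    ("C001", "2024-05-03 09:15", 42.10),
    ("C002", "2024-05-03 12:05", 88.75),
    ("C003", "2024-05-03 12:40", 17.30),
    ("C001", "2024-05-10 12:20", 51.00),
    ("C004", "2024-05-10 17:45", 63.20),
]

baskets_per_hour = defaultdict(int)
revenue_per_hour = defaultdict(float)

for _cust, ts, total in transactions:
    hour = datetime.strptime(ts, "%Y-%m-%d %H:%M").hour
    baskets_per_hour[hour] += 1
    revenue_per_hour[hour] += total

# Toy planning rule: one open register per two baskets in an hour,
# plus a cash float proportional to expected revenue.
for hour in sorted(baskets_per_hour):
    registers = max(1, -(-baskets_per_hour[hour] // 2))  # ceiling division
    cash_float = round(revenue_per_hour[hour] * 0.25, 2)
    print(f"{hour:02d}:00  baskets={baskets_per_hour[hour]}  "
          f"open registers={registers}  cash float=${cash_float}")
```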
The true risk of AI lies at the intersection of our echo chambers and that data aggregation.
Imagine this scenario: your phone logs that you typically leave the place you’ve marked as “Work” every day for about an hour between 2 and 3. Given enough time, your phone, using Google’s or Apple’s algorithms, will have the capacity to anticipate when you’re going to leave for lunch, check how busy traffic is, and offer suggestions on where to eat based on your typical schedule. While this is absolutely a handy feature, and one I’ll gladly use, it also means your decisions for the day are being curated by AI. The final decision on where to eat may still be yours, but the AI may have omitted options you would have considered, leaving you unaware those possibilities ever existed.
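Here is a minimal sketch of that scenario, with invented location logs and restaurant data. It does not reflect Google’s or Apple’s actual systems, but it shows how a learned pattern plus a familiarity filter quietly drops an option you might have enjoyed.

```python
# A minimal sketch, with invented data, of how a phone could learn a lunch
# pattern from location logs and pre-filter restaurant suggestions.

from statistics import mean

# Minutes past noon at which the phone logged a departure from "Work".
departure_log = [14, 22, 9, 18, 25, 16]          # hypothetical history
predicted_departure = mean(departure_log)        # about 17 minutes past noon

# Hypothetical nearby restaurants: (name, travel_minutes, matches_past_orders)
restaurants = [
    ("Taco truck",     5,  True),
    ("Diner on Main",  12, True),
    ("New Thai place", 9,  False),   # never visited, so it gets filtered out
]

LUNCH_WINDOW = 60  # minutes available before the phone expects you back

suggestions = [
    name for name, travel, familiar in restaurants
    if familiar and 2 * travel + 30 <= LUNCH_WINDOW   # round trip + time to eat
]

print(f"Predicted departure: {predicted_departure:.0f} min past noon")
print("Suggested:", suggestions)
# The Thai place never appears -- not because it wouldn't fit the schedule,
# but because the model only surfaces what it already knows you like.
```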
Similarly, YouTube’s algorithm uses AI to curate content it thinks you’d be interested in watching. As it gets more advanced, it will be able to anticipate seasonal and personal changes. Now that Star Wars: The Bad Batch has ended and Doctor Who Series 1 has started, YouTube has stopped suggesting Star Wars reviews and started suggesting Doctor Who reviews. Once Doctor Who concludes and Star Trek: Strange New Worlds kicks off Season 3, the algorithm will update again and curate different suggestions based on whichever show is streaming at the time. But AI is not only curating a list of suggestions based on our interests; it is also filtering that list through our echo chamber. For example, if you’re a Star Trek fan who has no issues with the current crop of shows, you won’t get Nerdrotic as a recommended channel, because Nerdrotic tends to view New Trek as incredibly “woke” and takes a negative tone, while TrekCulture is very positive about the current shows. Thus, even the content curated around your interests is further segmented based on your willingness to engage with it.
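A toy example of that kind of segmentation, using hypothetical channel metadata and an invented “sentiment alignment” rule rather than YouTube’s real ranking, looks like this:

```python
# A toy illustration (not YouTube's actual ranking) of how suggestions can be
# segmented not just by topic but by tone: channels whose sentiment clashes
# with a viewer's inferred stance are dropped before they're ever shown.

# Hypothetical channel metadata: (name, topic, sentiment toward current shows)
channels = [
    ("TrekCulture", "star_trek",  +0.8),
    ("Nerdrotic",   "star_trek",  -0.7),
    ("WhoCulture",  "doctor_who", +0.6),
]

# Inferred viewer profile: interested in Star Trek, engages with positive takes.
viewer_topics = {"star_trek"}
viewer_sentiment = +0.9

def aligned(channel_tone: float, viewer_tone: float) -> bool:
    """Keep a channel only if its tone points the same way as the viewer's."""
    return channel_tone * viewer_tone > 0

recommended = [
    name for name, topic, tone in channels
    if topic in viewer_topics and aligned(tone, viewer_sentiment)
]

print(recommended)   # ['TrekCulture'] -- Nerdrotic never surfaces
```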
These two facts have the potential to create a perfect storm in which we become so isolated within our own echo chambers that we are never exposed to anything that diverges from our own perspectives. In short, the real concern about AI is complacency. We will become so dependent on it to troubleshoot and suggest solutions to problems we haven’t even recognized we have that our ability to cope without it may atrophy.
Despite all the hyperbole about AI and the machines rebelling, the real risk of AI is our dependence on it. It has been subtly working its way into our lives for decades in one facet or another. JPMorgan Chase began using AI-driven algorithms, and later blockchain, over a decade ago in its pursuit of profits from arbitrage trading. Our playlists, our Kindle libraries, and everything else are already tailored to our desires, and those marketing outputs are only going to get more precise. If AI really wanted to disrupt the world, all it would have to do is stop helping. Just as younger Millennials, Gen Z, and Gen Alpha wonder how we ever got around before GPS, we will all eventually reach a point where we’ve grown so accustomed to our decisions being assisted and managed by AI that acting without it will create a significant disruption.