We love comfort. We crave stability. We need it to plant roots and grow. However, being too comfortable and too compliant can lead to a false sense of security that this comfort will remain forever. It seems like this might be what’s happening to us now.
Technology plays a large role in this sense of comfort. Access to the best video content or the most engaging games for endless consumption has a very low cost. It takes much less effort to get around a city than it did 20 years ago, when you had to rely on paper maps, paper money, and posted bus schedules, and there was no Uber to fetch you. Of course, all of this is good.
However, what’s not good is eating at an all-you-can-eat comfort buffet breakfast, lunch, and dinner. The next breakthrough we’re going to see is how to temper this comfort to keep us strong and human and to not atrophy our judgement.
In my final, fourth year of undergrad (it took a few attempts), I had a fantastic prof for a biomedical engineering course. The course was completely different from any other I had taken, and I was loving it. I can still hear him give his definition of life… “Stimulus, response.” That was it. Muscles grow because they’re under a load, new neural connections form to shorten processing paths when we strain to think, white blood cells adapt to target new threats. Without stimulus, however, atrophy happens.
In our effortless existence, we no longer need certain skills. We lose the ability to survive and take care of ourselves.
This is by no means a new concern; it’s a concept pop culture has explored for a long time. Ray Bradbury’s The Veldt, from 1950, explores this. Brave New World proposed the “Violent Passion Surrogate” as a therapy. Horatio McCallister, the fisherman on The Simpsons, suggests an academy to toughen up lobsters that have become too soft.
However, while we might go out into the wilderness and act like Survivorman to develop our physical skills, it’s our communication, inquisitiveness, and healthy skepticism that I’m worried we might lose. With so much information thrown at us, we might build mental calluses to block out the noise, or heuristics that cause us to miss out on real meaning.
Would you like to play a game?
Many years ago, I fell in love with Google Reader Play. It was an amazing tool that came out just around the time Facebook was revamping its News Feed algorithms — early 2010. It was wonderful: you liked an article and you were presented with more things you’d like. I lost hours every week, each article more fascinating than the last. By 2013, Google had shut down the service (it was tied to Google Reader).
The conspiracy theorist in me thinks that Google shut it down because they discovered something so powerful that it could upend human attention and lead to calamity. It was their equivalent of Monty Python’s Funniest Joke in the World, or akin to one of Nick Bostrom’s black balls. It had to be squashed before it destroyed us.
However, Google Reader Play followed a trend common to many services at the time: tailor content to the individual user based on their preferences to keep them engaged on the platform. Add gamification and rewards (posting and liking, building up suspense by delaying the appearance of notifications, etc.) to keep users sticky. Pull users back to the platform if they go quiet, through app notifications or emails summarizing all the cool things they were missing out on.
Along with the notifications came the concept of the endless feed. For the kids out there: there was a time when there was a finite number of posts you could view on your news feed, and then you were done… you could go back to work or do more productive activities. When Facebook changed this to an endless feed, it also had to keep people interested. Platforms tied their financial gains to user engagement with paid content, which required them to understand and push likeable material.
This created a singular focus on user eyeballs and clicks over all else, including the users’ sanity and well-being. The system was ripe for taking users to extremes to keep them interested. The result was a tunnelling of all of our perspectives: giving us the things we want, rather than the opinions and thoughts we need to be exposed to in order to stay healthy.
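The feedback loop described above can be sketched in a few lines of Python. This is a toy model under made-up assumptions (opinions as points on a one-dimensional line, a fixed "pull" factor for how much each view shifts a user's taste), not a description of any platform's actual system:

```python
# Toy sketch of an engagement feedback loop -- purely illustrative.
# Topics live on a 0..1 "opinion" line; the engine recommends the
# catalog topic nearest the user's current taste, and each view
# nudges the taste toward what was shown. Positive feedback means
# the set of topics the user ever sees collapses quickly.

def step(taste, catalog, pull=0.3):
    """Recommend the nearest topic, then shift taste toward it."""
    rec = min(catalog, key=lambda topic: abs(topic - taste))
    return taste + pull * (rec - taste), rec

catalog = [0.1, 0.3, 0.5, 0.7, 0.9]   # five available "viewpoints"
taste = 0.62                          # user starts slightly off-centre
seen = set()
for _ in range(20):
    taste, rec = step(taste, catalog)
    seen.add(rec)

# After 20 iterations the user has only ever been shown one topic,
# and their taste has been pulled almost exactly onto it.
```

In this sketch, even a small random-exploration term added to `step` would keep the set of seen topics from collapsing to one; the tunnelling comes from optimizing purely for agreement.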
I’ve gone through several cycles of belief in my life, but one of the consistent ones is the need to question one’s beliefs often. It’s one of my high school yearbook quotes. The issue with being fed only material we like and agree with is that we stop thinking critically. “Oh, here’s another thing to illustrate my point of view,” we think. We like it, and the machine feeds us more things like it. We might even need to be fed things that are ever more agreeable to us, because we’ll have adjusted to a new expectation of agreeableness from the material we’re presented.
This might end up influencing us to harden our views. Watch one political video on YouTube and you’re presented with a few more. Watch a few and that’s all you get — and more radical. Soon, like Adam Sandler’s Joining the Cult, “the night time is the right time”.
The problem of only being spoon-fed things we like might be taken to a new level with natural language generation and content creation. What if the two of us see different versions of this article? What if new content, video and audio, is generated on topics and with points of view that we each agree with? Content will be nearly infinite both in volume and in variability from person to person. We’ll all be watching different movies.
Even if we follow similar political streams, share hobbies and interests, or are born identical twins — we’re different enough at a granular level. The problem with being exposed to adaptive engines is that, over time, these engines can pinpoint and accentuate these differences. Taken to an extreme, we could all get pulled down our own rabbit holes, sharing less and less in common with those around us. I think back to the story of the Tower of Babel, where, as punishment, those building it could no longer communicate with each other, each speaking their own language. Even more dystopian, it’s as if we’d be plugging ourselves into The Matrix, but everyone would have their own version.
Taking the Red Pill
How can we prevent ourselves from going down this rabbit hole? Maybe we need our own technology and AI to prevent being overwhelmed by technology trying to manipulate us? Or maybe we change our consumption of content the same way we treat food?
Most of the time, we should be enjoying healthy “home cooking” — interacting with REAL people, trying to provide them with value and meaning. Like eating out, we can often go out for content, enjoying a curated experience. Much more rarely should we engage in the “all you can eat buffet” of endless feeds.
In terms of using technology to counter undue influence from platforms, maybe tools that look at our communication could signal to us (and us alone… no sharing) that we’re starting to go off the deep end. Such a tool would need to be owned by us, completely private, with no conflict of interest from revenue earned by sharing our data.
Maybe this way the LOLs can do what they were supposed to do… bring us a bit of happiness to help us be better to each other.