Why Personalized Feeds Will Deepen Our Divide
Last week, a friend introduced me to Medium. Although I never regarded myself as an avid reader, that night I couldn’t stop reading! Each article was more engaging than the last. After loading my brain up with a dozen articles or so, I hit the 2 a.m. wall and went to bed. The next morning on my bus ride to work, I opened up the Medium app on my phone. Article recommendations flooded my feed, begging me to continue the binge. As far as Medium could tell, my world revolved around GraphQL and Russian hackers.
They weren’t wrong. And that scared me.
It’s not that the tracking surprised me. I know how valuable user data is to a product like Medium. What scared me, actually, was the bubble of content that so quickly encapsulated me. In just a few hours of activity, they could pinpoint exactly where I fit in their whole ecosystem.
Think of all the sites that do this kind of data-driven personalization of feeds — YouTube, Facebook, Twitter, and Netflix to name a few. Personalization is at the core of why we love using these apps, so why should it scare us?
This made me recall a VICE episode (Season 5, Episode 28) I recently watched that discussed the current political divide in our country. In the episode, I was struck by a particularly eye-opening visualization taken from the Electome project at the MIT Media Lab.
This is a 3D visualization of how Twitter users were connected in the so-called “Twittersphere” at the time of the 2016 election. The divide between Clinton and Trump supporters was clear. Ironically, instead of a single bubble surrounding the entire universe as the early social network pioneers may have envisioned, there were two-ish clusters. This might explain why so many of us were shocked by the outcome of our recent presidential election. We lacked awareness of the people outside our own social networks.
This brings me back to the topic of personalized feeds. A recommendations engine is also a network, or more appropriately, a graph. Imagine day 1 of your Medium membership. You enter the “Mediumsphere” as a single node with no connections. After reading 4 articles, you build 4 connections from your user node to the articles, which are nodes themselves. And if other users have read the same articles as you, they have connections to those articles too. These users may be connected to articles of their own, which would be recommended to you. This would be something like a Customers Also Viewed section of an e-commerce site. Not only would you be affected by what they were reading, but they would be affected by what you were. If that trend continues, it’s not far-fetched to imagine clusters in the graph, just like the ones from the presidential election.
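To make the graph idea concrete, here is a minimal sketch of that user-article graph and a naive “Customers Also Viewed” recommender. This is illustrative only — Medium’s real engine is surely far more sophisticated — and all the usernames and article slugs are made up:

```python
from collections import Counter

# Hypothetical bipartite graph: each user node connects to the article
# nodes they have read. (All names here are invented for illustration.)
reads = {
    "you":   {"graphql-intro", "russian-hackers", "graphql-caching"},
    "alice": {"graphql-intro", "graphql-caching", "rust-tips"},
    "bob":   {"gardening-101", "sourdough-starters"},
}

def recommend(user, reads):
    """Suggest articles read by users who share articles with `user`,
    weighted by how many articles they share."""
    mine = reads[user]
    scores = Counter()
    for other, theirs in reads.items():
        if other == user:
            continue
        overlap = len(mine & theirs)  # shared articles = shared graph edges
        if overlap:
            for article in theirs - mine:
                scores[article] += overlap
    return [article for article, _ in scores.most_common()]

print(recommend("you", reads))  # → ['rust-tips']
```

Notice what never surfaces: bob’s gardening articles. Even this ten-line rule reproduces the clustering described above — recommendations flow only along existing connections, so the bubble reinforces itself.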
Now, I’m not saying this is all bad. No doubt I prefer to consume content suggested by likeminded individuals. That’s what makes Medium and others such enjoyable products. But the problem isn’t that we live in our own bubbles. The problem is that we’re not always aware of it.
So what can we do?
Let’s first establish some ground rules.
- People will only consume what they want to consume. If you think you can force people to consume content for the sake of offering different perspectives, they’ll lose interest.
- People are stubborn and prideful. It’s hard to change someone’s opinion by telling them they are wrong. They need to come to that decision on their own.
- People are different, and will continue to be different. We shouldn’t expect a consensus on anything.
These assumptions make a solution hard to craft. A reminder of what we’re trying to solve — we don’t want to burst the bubble, we want to be aware of it. If we are aware, we can empathize with what’s outside of it. And if we can empathize, maybe our bubbles will start clustering back towards each other. Knowing this, here is what I propose.
Content needs to be provided with more consumer context. An article’s author is easy to identify, but what about its consumers? Who are they? Do they agree or disagree? Are they left or right leaning? What is their background? What other content do they prefer? All of that additional context, visible on our screens, would help to expose the bias inherent in every opinion piece. The point is not to label content as good or bad; rather, it is to remind the consumer that it’s not the only content.
How subtle this change should be is debatable. I’m not sure whether it would turn off viewers, and whether it would change anything at all is questionable. Consider organizations like MSNBC and Fox News, both of which are known to attract a particular side of the political spectrum. Viewers aren’t exactly browsing both sources just to “see it from the other side”.
But maybe it’s not MSNBC and Fox News who need to change. It’s those who claim not to represent a side that need to. Because content on these nonpartisan platforms is going to be perceived as exactly that: nonpartisan. And we know that’s not a realistic expectation!
Whether you think this is the fault of the people or the content providers, I say who cares. To solve this issue, the content providers must make the first move. If they continue to pigeonhole us in personalized feeds, we’ll form denser, more distant bubbles without realizing it, or worse, without caring. Especially as AI advances, the bubble will influence more aspects of our lives. And if everything we’re exposed to is data-driven and catered, have we even considered what that means for human inspiration and free-thinking?
Intended or not, Medium’s own name suggests that it is centered and nonpartisan. However, without proper context, its content is anything but medium. If we agree that there is no eradicating bias in opinion pieces like this one, the least we can do is expose it.
Thank you for taking the time to read this post. I couldn’t have written this without having some great conversations with smart people, so thank you to them. If you have advice on how to improve my writing, please send me a private message. I’d really love to hear your feedback!