Five risks of news personalization

Here are the most dangerous ethical traps of personalized news services that media organizations should be aware of, and why a tailored approach for individuals is still necessary.

Titus Plattner
JSK Class of 2018
8 min read · Jun 12, 2018

Personalization has proven its effectiveness in increasing user engagement, and news outlets are starting to explore this opportunity. I showed the bigger picture of this possible golden age here and listed 10 actionable points here. But because the field is so new, with solutions and best practices still to be invented, many errors are being made… and, unfortunately, ethical questions are often left for later. Here’s a list of serious ethical risks that pioneers should (really) care about.

Will different groups of society lose any common reference point? Three student cohorts at the Stanford d.school. (Photo: Titus Plattner)

1. Reinforcement of filter bubbles

The idea that algorithms might reinforce the echo-chamber effect, particularly in the distribution of news on social media, is now a widespread public discussion. Bias bubbles, filter bubbles, information blindness, preference bubbles: these are all different ways to describe the same phenomenon. People get what they want, but they are less exposed to other opinions and less and less confronted with facts they’re not interested in.

By using the same click-driven recommendation engines on their platforms, media outlets could reproduce this toxic atmosphere, just on a smaller scale than social media or news aggregators.

This risk shouldn’t go unheeded. But unlike Facebook, Twitter or aggregators like Google News and Nuzzel, media organizations have a deep understanding of the content they produce. Most news organizations aim to deliver a balanced representation of the truth. For media with a strong fact-based journalism culture, there is only a residual risk that they could produce and spread misinformation. Nevertheless, the constant questioning of the fairness and accuracy of their own reporting, and a clear, transparent and humble attitude towards critiques, are even more important in this new world.

That holds for general news. The impact of personalization becomes trickier for opinion pieces or analysis. In these fields, there is a reasonable risk that users would only get points of view that conform to what they already think. Worse, they might be driven toward even more extreme opinions. I see two serious objections to this concern.

First, filter bubbles have always existed. For example, my 81-year-old aunt, who has strong liberal values, is not on social media. But she has read The New York Times every day for more than 40 years and watches MSNBC. She wouldn’t read The Wall Street Journal, The Economist or The Washington Post… and if she knew how to do it, she would certainly delete Fox News from her TV channel list. She lives in a comfy liberal filter bubble.

Second, media organizations are now able to correctly label opinion and analysis pieces, and some have agreed on a new machine-readable standard, led by the Trust Project. Rather than reinforcing their bubbles, understanding readers’ individual preferences could be an opportunity to create new ways to explore opinions in the news, for example by letting the reader flip from a liberal point of view to a conservative one.
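
To make this concrete, here is a purely illustrative sketch, in Python, of what such a machine-readable label might look like. It is loosely modeled on schema.org-style NewsArticle markup; the type name, fields and values are my assumptions for illustration, not the Trust Project’s actual vocabulary.

```python
import json

# Illustrative only: the field names and the OpinionNewsArticle type are
# assumptions modeled on schema.org-style markup, not a verified standard.
opinion_label = {
    "@context": "https://schema.org",
    "@type": "OpinionNewsArticle",  # flags the piece as opinion, not reportage
    "headline": "Why the new tax bill falls short",
    "author": {"@type": "Person", "name": "Jane Doe"},
    "publisher": {"@type": "Organization", "name": "Example Times"},
}

# A recommender that can read such a label could deliberately surface the
# piece arguing the opposite viewpoint next to this one.
print(json.dumps(opinion_label, indent=2))
```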

2. Closing people into a tiny box

Narrowly tailoring the feed to people’s individual preferences would imprison them in a tiny space, with no exposure to what’s outside.

The joy of a surprise, an unexpected discovery, has always been part of the media experience. In a linear consumption mode, when people went from page 1 to page 64 of a newspaper or magazine, or from minute 1 to minute 26 of a news show, the audience was inevitably exposed to many unexpected pieces of content. In the digital world, especially in a personalized setting, it is important to keep flexible boundaries in the curation process, in order to let what’s outside of each user’s personal preferences nourish the whole experience.

“Don’t over-personalize”, said Bethany Ostecchini, director of Time Inc. UK’s beauty website, This is Powder, in an interview with Reuters Community. “It is very tempting to stretch the system to its limits. For example, I would only see content with a 30-something woman with very dark hair, green eyes and medium skin. But actually, I would like to see women of all ethnicities. Sometimes, I want to see products that may be outside of my budget or under the budget that I decide. It’s about aspiration. Getting that right is really important.”

This spirit of maintaining a certain porosity should be applied to news apps. Their interfaces should also keep space for spontaneous, linear discoveries. The simplest version of that would be a plain timeline compiling the most recent news pieces.

3. When what interests the public overruns the public interest

Can algorithms detect what is in the public interest? Or are they only good at finding what interests the public? Up until now, most recommendation engines have been totally incapable of sorting news by importance, and the crowd is not always the wisest editor. To correct for this, some recommendation engines lean on an item’s original position in the feed: if something was manually placed at the top of the homepage, it is flagged as “important”. But that’s it.

Human editors, by contrast, are trained to quickly weigh the relevance of a constant flow of news. This is not an exact science, and what is barely relevant for one outlet could be very important for another. These context-dependent nuances are precisely what make the task so hard to automate.

Controversial content often has a higher share rate or a bigger number of comments. That’s why clicks, shares and reading time shouldn’t be the only variables algorithms use to rate news pieces. To prevent the interests of the public from overrunning the public interest, the next generation of recommendation engines should integrate human-crafted metadata about the importance of a piece, its longevity (“ever-greenness”), its geographic relevance, and so on. This could be generated on the content production side: news organizations could add this information at a reasonably low cost through their existing production channels. They could then turn this metadata into a competitive advantage and improve the user experience on their own platforms. Third-party news aggregators would not get this metadata and could never reach the same level of relevance.
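
As a minimal sketch of how such a blend could work, here is some Python with hypothetical field names and weights; nothing here reflects an existing product, and the importance, evergreenness and geo_relevance fields simply stand in for the human-crafted metadata described above.

```python
from dataclasses import dataclass

@dataclass
class EditorialMetadata:
    importance: float     # 0-1, set by the newsroom at production time
    evergreenness: float  # 0-1, how slowly the piece loses relevance
    geo_relevance: float  # 0-1, match between story region and reader region

@dataclass
class EngagementSignals:
    click_rate: float  # normalized 0-1
    read_time: float   # normalized 0-1

def rank_score(meta: EditorialMetadata, signals: EngagementSignals,
               hours_old: float) -> float:
    """Blend editorial judgment with behavioral signals so that
    popularity alone cannot crowd out public-interest stories."""
    popularity = 0.5 * signals.click_rate + 0.5 * signals.read_time
    editorial = 0.6 * meta.importance + 0.4 * meta.geo_relevance
    # Evergreen pieces decay slowly; breaking news decays fast.
    decay = 1.0 / (1.0 + hours_old * (1.0 - meta.evergreenness))
    return decay * (0.5 * editorial + 0.5 * popularity)

# Example: an important local story with modest clicks still ranks decently.
score = rank_score(EditorialMetadata(0.9, 0.2, 1.0),
                   EngagementSignals(0.2, 0.3), hours_old=6.0)
print(round(score, 3))
```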

Metadata about the quality of the content could also be crowdsourced from the audience side, with specific questions. At the end of each article, the Swiss news organization Tamedia, which I work for, asks a simple question: Was this article worth reading? Others, like “Do you feel well informed?”, “Is this important to know?”, “Do you feel happier?” or “Is this inspiring?”, would probably also work in certain contexts. The quality of the conversation that an article drives could eventually be measured as well.

4. Disappearance of common reference points

With the exponential multiplication of TV channels and the fragmentation of linear TV in the era of streamed video content, television has become a much less common experience across society. Print news was always more stratified, but it seemed to remain more stable in the digital world: we could all still read the same front-page article and talk about it.

Now, what happens if, because of personalization, I see a totally different set of content than my neighbor or my friends? How could we have common ground for a conversation? If people have similar demographics and reading behavior, they’ll probably get mostly similar content. But what if they are quite different?

Worse, what if the content itself is also personalized? How could somebody even share this content? In a Stanford class, with an interdisciplinary group of students, we imagined a solution to tackle that challenge. Each different version of an article would have a unique identifier, and there would be transparency about the fact that the article is just one of many versions.

If an article becomes the subject of a legal dispute, this raises many unanswered questions. What is the liability of a publisher if one of many versions contains a defamatory paragraph? How would a judge assess the damage? Or imagine a situation where the system showed an uncensored version of an investigative piece to everybody except the lawyers of the accused person, who would get a legally bulletproof version. This would of course be totally unfair.

If an error were made, who should see the erratum or the right of reply? Only those who clicked on the original article? Or everyone?

To prepare themselves for all the situations that will inevitably occur, media organizations should build a system where transparency and traceability are key. Wikipedia has proven that it is possible to track every single version of a text.
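
One possible shape for that traceability, sketched in Python under assumed requirements: a stable base identifier shared by all variants, plus a content-addressed version ID, so every version ever served can be looked up later. The field names are hypothetical.

```python
import hashlib
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ArticleVersion:
    base_id: str   # stable identifier shared by all variants of the article
    variant: dict  # the personalization parameters used for this version
    body: str
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    @property
    def version_id(self) -> str:
        """Content-addressed ID: same text and parameters, same ID."""
        payload = json.dumps(
            {"base": self.base_id, "variant": self.variant, "body": self.body},
            sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()[:16]

# An append-only log of every version ever served makes it possible to
# answer "who saw which text?" when an erratum or a lawsuit arrives.
version_log: list[ArticleVersion] = []

v = ArticleVersion(base_id="2018-06-investigation-x",
                   variant={"audience": "short-read", "region": "CH"},
                   body="Condensed text of the piece…")
version_log.append(v)
print(v.version_id)
```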

5. Loss of privacy in media consumption

On the other hand, the user’s privacy must be guaranteed. The European Union’s General Data Protection Regulation (GDPR) requires clear consent and justification for any personal data collected from users. News organizations should be exemplary in how they explain what data they store about their customers and in giving them access to it. A user dashboard should also give the user full control over what is stored.

In some situations, the simple act of reading a particular article can be a threat. If someone reads the same article about an unsolved murder many times, and the police learn this from the data, would that person become a suspect? Should media organizations cooperate with the authorities? And what about political activism?

In order to keep the user’s trust, as little personal data as possible should be stored. These issues could be addressed by establishing differential privacy, or by the ability to delete or randomize particular entries without keeping any logs.
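
To illustrate one of these techniques, here is a toy randomized-response sketch, a simple form of local differential privacy. The 75 percent truth probability and the scenario are made-up parameters for illustration, not a recommendation.

```python
import random

def randomized_response(did_read: bool, p_truth: float = 0.75) -> bool:
    """Store a noisy version of a sensitive reading event.

    With probability p_truth we record the true value, otherwise a fair
    coin flip. Aggregate statistics remain estimable, but no single
    stored bit proves what one user actually read.
    """
    if random.random() < p_truth:
        return did_read
    return random.random() < 0.5

def estimate_true_rate(noisy: list, p_truth: float = 0.75) -> float:
    """Invert the noise to estimate how many users really read the piece."""
    observed = sum(noisy) / len(noisy)
    # observed = p_truth * true + (1 - p_truth) * 0.5  =>  solve for true
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Example: 10,000 users, 30% of whom actually read the sensitive article.
truth = [random.random() < 0.3 for _ in range(10_000)]
stored = [randomized_response(t) for t in truth]
print(round(estimate_true_rate(stored), 3))  # close to 0.3
```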

As of May 2018, the Google News experience on mobile devices begins with a “For You” section that contains a briefing on the five top stories the algorithm has pulled for you. It is supposed to be a mix of global headlines, local news, and new developments on stories you’ve been following. The New York Times, The Financial Times and The Washington Post, among others, have started to experiment with personalization. Apple News is also heavily personalized. The trend seems to be unstoppable. Larger news organizations hold good cards. But those who want to dig their golden nuggets out of this new land should anticipate the ethical challenges that come with it, instead of risking a backlash.

After one year exploring news personalization as a John S. Knight Journalism Fellow at Stanford, I am still convinced that this is a huge opportunity for the news business. But in order to protect its long-term interests, the industry must do it responsibly. Those who sacrifice the highest ethical standards will endanger their most important asset: the trust of their audience.

Note:

In a previous article, I wrote an introduction about Why personalization will be the next revolution in the news industry. And in another one, I described 10 effective ways to personalize news platforms.

I will continue to work on this subject and on many others at Tamedia, Switzerland. I am happy to hear your point of view and start a conversation. I’m open to collaborations.

Please write me at: titus.plattner@stanford.edu
