Is the endless dissemination of news media, at our fingertips 24/7, really democratizing our access to information?
This post has been directly informed by Professor Julia Cagé and her “Future of Media” course at Sciences Po.
Even if we set aside scandals such as Cambridge Analytica or Russian interference in the 2016 US presidential election, the day-to-day reality of how news is disseminated and consumed on social media is troubling. As digital technology continues to elude policymakers’ grasp across different sectors, tech giants are not regulated the way traditional news broadcasters are. But since most Americans now get at least some of their news via social media, this gap deserves to be reconsidered. Some of the mechanisms through which new technologies are influencing the freedom of the press include the creation of filter bubbles; the increasing political polarization of social media; and the dissemination of fake news. On the other hand, new technologies such as blockchain may help address the news media’s longstanding and emerging challenges.
Profit driven algorithms employed by tech giants are swallowing the freedom of the press, and they have already rapidly and fundamentally transformed news media and journalism. According to Emily Bell, “our news ecosystem has changed more dramatically in the past five years than perhaps at any time in the past five hundred.” This should concern all citizens who care about democracy: as Julia Cagé lays out in her book “Saving the Media”, access to independent information is a public good that benefits the whole of society.
In his book titled “The Filter Bubble,” Eli Pariser describes how algorithms on social media control who gets what information. Social media platforms and disseminators of news such as Facebook and Twitter, as well as aggregators such as Google and Yahoo, have enormous power in curating the information we have access to. With every click we self-select into parallel bubbles, becoming further and further polarized in our political ideologies and the way we understand society’s problems, as well as potential solutions. According to Pariser, this personalization of information is exacerbated by our lack of awareness: we don’t have a choice of whether or not to enter the filter bubble; when we’re in it we’re not aware of how our content is being filtered; and no two users’ filter bubbles are the same.
“The Like button is our new ballot box, and democracy has been transformed into an algorithmic popularity contest.” -Antonio García Martínez
Thus the algorithms created by tech giants are the invisible gatekeepers of information that shape our perceptions and political discourse. And Pariser notes that when the information we see is based on what we usually click on first and most often, we get an unhealthy proportion of information “dessert” (of the instantly gratifying entertainment variety), instead of a balanced diet of important information and opposing viewpoints. The historically centrist US Senator Claire McCaskill makes the point that positive news stories (which don’t involve conflict or elicit fear and anger), such as the bipartisan passage of legislation to make hearing aids far more affordable for Americans, are missing from social media due to “shifting business models in journalism”. What happens to our perceptions of the world when our feed becomes indulgent, making our searches more of a pleasurable or stimulating experience than a truly enlightening one that challenges us and makes us uncomfortable? How different is this method of gathering information from the way librarians taught us to research, when the internet was just taking off?
“This moves us very quickly towards a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.”-Eli Pariser
According to Antonio García Martínez, the algorithms and AI software used by Facebook to display news media contributed decisively to President Trump’s win in the 2016 US election (far more than Russian trolls on the social media site). Because of features such as the “ad auction” that reward sensationalist and provocative click-bait material, Trump’s team was able to win precious ad space more cheaply than Hillary Clinton’s team. Martínez explains how this process, initially developed by Google, works. “If Facebook’s model thinks your ad is 10 times more likely to engage a user than another company’s ad, then your effective bid at auction is considered 10 times higher than a company willing to pay the same dollar amount. A canny marketer with really engaging (or outraging) content can goose their effective purchasing power at the ads auction, piggybacking on Facebook’s estimation of their clickbaitiness to win many more auctions (for the same or less money) than an unengaging competitor.” He also notes that Facebook’s advertising costs are lower in sparse and rural areas of the US, making it cheaper for politicians with a voter base in these regions to reach their target audience.
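The engagement-weighted auction Martínez describes can be sketched in a few lines. This is a purely illustrative model: the function names, the numbers, and the linear engagement multiplier are assumptions, not Facebook’s actual system.

```python
# Toy model of an engagement-weighted ad auction (illustrative only).

def effective_bid(dollar_bid, predicted_engagement, baseline_engagement):
    """Scale a dollar bid by how engaging the model predicts the ad to be,
    relative to a typical ad. A 10x-more-engaging ad bids 10x as strongly."""
    return dollar_bid * (predicted_engagement / baseline_engagement)

def run_auction(ads, baseline=0.01):
    """Rank ads by effective bid; the most engaging dollar wins the slot."""
    return max(ads, key=lambda ad: effective_bid(ad["bid"], ad["ctr"], baseline))

ads = [
    {"name": "bland_ad",   "bid": 5.00, "ctr": 0.01},  # average engagement
    {"name": "outrage_ad", "bid": 0.60, "ctr": 0.10},  # 10x more engaging
]
winner = run_auction(ads)
# The provocative ad wins the auction despite bidding far less money.
```

This is the dynamic the quote above describes: outrage is effectively a bid subsidy, so provocative content buys reach at a discount.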
Martínez further describes the Trump team’s use of Facebook’s “custom audience” and “look-a-like audience” features to target people based on their online activity outside of Facebook. This activity includes the email you use to sign up for newsletters; the postal code you use for online shopping; and any political activity like donations to a campaign. Algorithms analyze these data points to predict which users would be most susceptible to a certain type of political messaging, and these user profiles then get matched onto existing Facebook accounts to create a “custom audience.” Most of us would be surprised at how much data Facebook gets to piece together while we’re not even on the site, to help their paying customers (from companies to political candidates) develop ads most likely to successfully shape our thinking. “Facebook can also populate an audience by reading a user’s cookies — those digital fragments gathered through a user’s wanderings around the web.”
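The matching step behind a “custom audience” can be sketched as follows. This is a hypothetical illustration: identifiers such as emails are typically normalized and hashed before being compared, but the exact pipeline is not public, and all names and addresses here are made up.

```python
# Hypothetical sketch of matching off-platform records (emails, postal
# codes) to existing accounts to build a "custom audience".
import hashlib

def h(value):
    # Identifiers are normalized and hashed, so raw personal data
    # is never exchanged directly between advertiser and platform.
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

# The platform's side: hashed identifiers mapped to accounts.
platform_accounts = {
    h("alice@example.com"): "alice",
    h("bob@example.com"): "bob",
}

def build_custom_audience(advertiser_emails):
    """Match an advertiser's contact list against platform accounts."""
    return [platform_accounts[h(e)] for e in advertiser_emails
            if h(e) in platform_accounts]

audience = build_custom_audience(["Alice@Example.com ", "carol@example.com"])
# → ["alice"]: normalization makes the hashes line up; unmatched emails drop out.
```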
The “look-a-like” audience feature allows advertisers on Facebook to cast an even wider net, and reach a much larger social network. As Martínez explains, with a simple click “Facebook now searches the friends of everyone in the Custom Audience, trying to find everyone who (wait for it) “looks like” you.” These like-minded people are also determined by a Facebook algorithm that takes into account things such as users’ mutual engagement: the friends whose new profile picture you always like for moral support and vice versa, or those who like the same Buzzfeed articles. In addition to targeting likely Trump voters, the Trump campaign used the “look-a-like” feature to reach likely Clinton voters and fed them “engaging but dispiriting” content about her.
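One plausible way to model “look-a-like” expansion is as a similarity search over user activity, starting from a seed audience and keeping users whose behavior overlaps enough with it. The Jaccard measure over liked pages below is an assumption chosen for illustration; the real algorithm is proprietary.

```python
# Illustrative sketch of "look-a-like" audience expansion.

def jaccard(a, b):
    """Overlap between two sets of liked pages, from 0.0 to 1.0."""
    return len(a & b) / len(a | b) if a | b else 0.0

def lookalike_audience(seed_likes, candidates, threshold=0.5):
    """candidates: {user: set of liked pages}; keep users whose likes
    are sufficiently similar to the seed audience's."""
    return [user for user, likes in candidates.items()
            if jaccard(seed_likes, likes) >= threshold]

seed = {"page_a", "page_b", "page_c"}
candidates = {
    "dana": {"page_a", "page_b", "page_c", "page_d"},  # overlap 3/4
    "eli":  {"page_x", "page_y"},                      # no overlap
}
# lookalike_audience(seed, candidates) keeps "dana" only.
```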
To be clear, Martínez notes that there is no reason to think Clinton’s campaign team did not use the “custom audience” and “look-a-like audience” features as well (both have been around for years and are mundane in the world of marketing). Perhaps Trump’s team was simply better at promoting sensational stories, reaching its voter base, and doing so with less money. The concern is that social media is overtaking traditional media as the place where we consume news and political information, and it is currently governed by secretive algorithms that we know are driven by private profit rather than the democratic values, such as truthfulness and impartiality, that we would expect from, say, public broadcasters. This infringes on press freedom because we are not exposed to the diversity of ideas that exist: we are steered towards salient content that generates the most advertising revenue.
“If we’re going to reorient our society around Internet echo chambers, with Facebook and Twitter serving as our new Athenian agora, then we as citizens should understand how that forum gets paid for.”-Antonio García Martínez
Does this impact democracy in a way that traditional news media has not? At the moment, the debate is not “if,” but “to what extent”. Researchers Allcott & Gentzkow found that the dissemination of fake news on social media was not, by itself, enough to tip the scales in favour of Trump. But they did find that, in the run-up to the 2016 US election, the top fake news article was shared more times on social media than the top mainstream news article. Furthermore, fake news stories were overwhelmingly pro-Trump: the database they compiled “contains 115 pro-Trump fake stories that were shared on Facebook a total of 30 million times, and 41 pro-Clinton fake stories shared a total of 7.6 million times”. Articles from the press were filtered and magnified in disproportionate ways by algorithms that serve to maximize engagement, and thus Facebook’s profits, rather than the journalistic integrity that a functioning democracy requires.
As the story usually goes, technology can be the problem but also the solution. Innovative thinkers are eyeing blockchain as a way to decentralize media, moving away from the above models where Facebook and Twitter rule. Civil News Media group is attempting to do this by selling cryptocurrency tokens directly to readers, who can then use them to subscribe to different publications. Civil claims to remove the for-profit “middle man” that usually influences how media is created and disseminated. Instead, it is entirely community owned and managed, as citizens can launch their own newsrooms on the platform. The idea is that if they produce original, high quality journalism, they will be rewarded financially with tokens from fellow citizens. No single entity will be in charge, and “all activity is driven by community consensus, or, a popular vote”.
Some benefits of this decentralized system include preventing legitimate news media archives stored on the blockchain from ever being deleted; using reader consensus as a “check” to identify and remove fake news articles; and directly supporting and incentivizing good journalism with cryptocurrency tokens (similar to Medium claps, if they resulted in real money). This has proven difficult: Civil News Media has raised only slightly over $1 million through this crowdfunding approach, despite its minimum goal of $8 million. A 2018 Reuters report on the state of the news media revealed that the public’s willingness to pay for news remains very low, especially in English-speaking countries. Our stinginess may be one reason why this community ownership approach is having difficulty taking off.
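The token mechanics described above can be sketched as a toy ledger, in which readers reward newsrooms directly with tokens and community votes flag stories. Everything here, from the class name to the vote threshold, is an illustrative assumption, not Civil’s actual protocol.

```python
# Toy sketch of a reader-funded token system with community moderation.
from collections import defaultdict

class TokenLedger:
    def __init__(self):
        self.balances = defaultdict(int)   # holder -> token balance
        self.votes = defaultdict(int)      # story  -> net community votes

    def tip(self, reader, newsroom, amount):
        """A reader spends tokens to reward a newsroom directly,
        with no advertising middle man taking a cut."""
        if self.balances[reader] < amount:
            raise ValueError("insufficient tokens")
        self.balances[reader] -= amount
        self.balances[newsroom] += amount

    def vote(self, story, up=True):
        """Readers vote stories up or down; consensus is a running tally."""
        self.votes[story] += 1 if up else -1

    def flagged(self, story, threshold=-3):
        """A story that accumulates enough down-votes is flagged
        by community consensus as suspect."""
        return self.votes[story] <= threshold
```

In this sketch, good journalism earns tokens through tips, while the vote tally acts as the “check” on fake news that reader consensus is supposed to provide.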
Other innovative ways to fight fake news and misinformation are being developed in an arms race with technologies such as “deep fakes”. The InVID Project rolled out by the European Union is like Turnitin (the plagiarism-prevention service most students are familiar with) for videos. InVID’s vision is to develop “a knowledge verification platform to detect emerging stories and assess the reliability of newsworthy video files and content spread via social media.” InVID technology can assess, for example, whether an existing video has been taken out of context and/or doctored to promote a falsehood, as has happened recently in Europe for the purpose of demonizing migrants. The technology may also prove helpful in other parts of the world: the Trump administration was recently accused of using a doctored video to revoke a journalist’s access to the White House. This video demonstrates InVID in action.
It is clear that the internet, specifically social media and news aggregators, is rapidly changing news media and the way we access it. We do not yet know how well our democracies will weather this development, but people are paying attention. “The Markup” is a new organization based in New York that will begin publishing stories in 2019 to critically investigate this very issue. We do know that the dissemination of information has always been a contentious issue, with fair criticisms to be made of the technology of the day. Pariser notes that although the gatekeepers of information flows are now algorithms, they used to be editors. Yet he argues that human editors were at least more visible gatekeepers with a clearer set of ethics, however imperfect those may have been. Furthermore, Emily Bell points out that the number of people who now control the dissemination of news, a critical aspect of public life, is very small. They are the owners of tech platform companies who never set out to determine the fate of the freedom of the press, nor did they plan for how to handle this responsibility.
We do know that private companies such as Google and Facebook have enormous unchecked power to decide what news media content gets valuable real estate, and which of their users see what. This has resulted in an increased propagation of fake news on social media, particularly impacting the recent US election, as well as increasing polarization. We must accept that good journalism costs money, thus it is not something to expect from our “free” social media accounts. Citizens may well disagree on what “good” journalism entails, but most would agree that it should be independent and transparent. We should move towards paying media outlets that have more transparent funding mechanisms than tech giants, if we want to maintain a healthy democracy.