Are algorithmic media harming the principles of objectivity and balance in news reporting?

Rawan Al Shaikh
Published in JSC 419 Class blog
5 min read · Oct 19, 2018

Current generations know Wikipedia the way they know their own names. However, they tend to forget the algorithmic machinery behind it. As not only a media user but also a young journalist and a student in this field, I strongly believe that algorithmic media are harming two of the most crucial principles in news reporting: objectivity and balance.

In news reporting, objectivity and balance are essential to a truthful and credible news piece; transparency and fairness must be ensured as well. With algorithmic media, however, these principles are being harmed. The concept of journalistic objectivity has three meanings: ontological, epistemological, and procedural. Journalists describe things the way they are, support their claims with sources, methods, and evidence, and claim to balance views and treat sources fairly (Ward, 2009, p. 73). Objectivity does not mean that journalists are free of outside bias, but that the methods they use are consistently and accurately applied, so that personal biases do not undermine the accuracy of their work.

An algorithm is a finite list of instructions that a machine performs in order to calculate a function; it is a piece of computer code (Lotan, 2013). Algorithmic media have changed what counts as knowledge and what is considered relevant. They deliver media content instantly by tracking the audience's preferences and behaviors.
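To make that definition concrete, here is a deliberately simple illustration in Python (my own sketch, not drawn from any of the sources cited here): a finite list of instructions that computes a function, in this case the greatest common divisor of two numbers.

```python
def gcd(a, b):
    """Euclid's algorithm: a finite list of instructions
    that computes the greatest common divisor of a and b."""
    while b != 0:
        a, b = b, a % b  # repeat until the remainder reaches zero
    return a

print(gcd(48, 18))  # prints 6
```

Every algorithm that curates a news feed or a results page is, at bottom, a list of steps like this one, only with far more steps and far more consequential inputs.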

Facebook is a huge social network used daily by two billion people around the world. Every user follows pages, has numerous friends, and reacts to content posted on the platform by other users. Anyone can publish their own opinions and their own way of living and of seeing the world. There are far too many posts, status updates, and photos from friends and followed pages for any user to see them all: this is where the News Feed, or "Wall," enters the plot. Since 2006, users no longer have to search for information or friends. The network suggests content they may like or want to know about: in brief, whatever they are most likely to react to.
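As a rough sketch of how such a feed can work (a toy model of preference-based ranking; Facebook's actual News Feed algorithm is far more elaborate and not public), the platform scores every candidate post against the user's tracked behavior and serves only the highest-scoring ones:

```python
# Toy feed ranker. All field names and weights are hypothetical
# illustrations, not any real platform's system.

def relevance(post, interests):
    """Score a post by how well its topics match tracked interests,
    with a small penalty for older posts."""
    topic_score = sum(interests.get(topic, 0) for topic in post["topics"])
    return topic_score - 0.1 * post["age_hours"]

def build_feed(posts, interests, limit=2):
    """Return only the posts the user is most likely to react to."""
    ranked = sorted(posts, key=lambda p: relevance(p, interests), reverse=True)
    return ranked[:limit]

posts = [
    {"id": 1, "topics": ["sports"], "age_hours": 2},
    {"id": 2, "topics": ["politics"], "age_hours": 1},
    {"id": 3, "topics": ["sports", "news"], "age_hours": 5},
]
interests = {"sports": 3, "news": 1}  # engagement counts tracked per topic

print([p["id"] for p in build_feed(posts, interests)])  # prints [3, 1]
```

In this toy example, the politics post never reaches the user at all: the narrowing happens silently, before any choice is offered.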

Since "algorithms play an increasingly important role in selecting what information is considered the most relevant to us" (Gillespie, Boczkowski, & Foot), we are no longer able to fully choose which direction we take. Through these selections, the audience is confined to the limited set of topics that the search engines provide. For example, if you google the word "Sun," you get to pick the link that appeals to you most; however, the set of links available to pick from was not assembled by you.
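In the same spirit, here is a toy sketch of how a search engine decides which links you may pick from (a hypothetical term-counting scorer; real engines weigh hundreds of signals):

```python
# Toy search ranking: the user supplies the query, but the set of
# candidate links and their order are decided by the engine.

pages = {
    "https://en.wikipedia.org/wiki/Sun": "the sun is the star at the center of the solar system",
    "https://example-news.com/sun": "sun news weather forecast sunny day",
    "https://example-shop.com/sun": "buy sun lotion sun glasses sun hats",
}

def search(query, index, top_k=2):
    """Rank pages by how often the query terms appear; return top_k."""
    terms = query.lower().split()
    scores = {
        url: sum(text.split().count(t) for t in terms)
        for url, text in index.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_k]  # everything below the cutoff is invisible

print(search("sun", pages))
# prints ['https://example-shop.com/sun', 'https://en.wikipedia.org/wiki/Sun']
```

The user typed the query, but the cutoff, the ranking, and therefore the visible options belong entirely to the engine; the news page here simply never appears.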

We must also understand the concept of categorization that comes along with algorithmic media's databases: as soon as we have categories, we have opinions, views, decisions, and different options, which means that objectivity is far from healthy. As algorithms calculate what is "in" and what is "out," all we do is pick from their choices. It is not only objectivity and balance that they harm, but also our freedom. You are forced to choose between two options when, sometimes, a third would have been your favorite; but then again, you had no idea it existed.

Screenshot: a search for the word "white" on Google

For example, I searched for the word "white" on Google, and the image above illustrates what the algorithms gave me. One of the main results was a place with the word "white" in its name, located near me in Beirut. This not only proves that I am being monitored, but also steers my mind through marketing toward places to go, places that had already paid a lot to rank high in the search results.

We certainly need to ensure transparency and a fair representation of all voices in society. I suggest that we provide an explanation for each document, link, page, picture, tool, or even word served on the page. Society deserves an explanation; it needs to understand why Wikipedia pops up first, why a certain picture sits on the first two pages, and why some links land at the end of the page. I have the right to know why Facebook shows certain things on my news feed. Why? Because we have all surfed the first two pages of Google thinking the most relevant information lies there, and ended up disappointed or thrilled. It pisses me off, for example, how Google finishes my sentences. I know what I want; I don't need you to tell me.
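One hedged sketch of what such an explanation could look like (entirely hypothetical; no platform currently exposes its ranking this way): every item served to the user carries a plain-language reason for its selection.

```python
# Hypothetical "explainable feed": each served item is paired with
# the reason it was selected, so the user can see why it ranked.

def explain(post, interests):
    """Build a human-readable reason for why a post was shown."""
    matched = [t for t in post["topics"] if t in interests]
    if matched:
        return f"shown because you often engage with: {', '.join(matched)}"
    return "shown because it is recent"

posts = [
    {"id": 1, "topics": ["sports"], "age_hours": 2},
    {"id": 2, "topics": ["travel"], "age_hours": 1},
]
interests = {"sports": 3}

for post in posts:
    print(post["id"], "->", explain(post, interests))
# 1 -> shown because you often engage with: sports
# 2 -> shown because it is recent
```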

Speaking of Wikipedia: with around 19 billion page views a month, it is one of the most visited websites online. However, no college professor has ever allowed me to cite it as a source at the end of my essays. "It is not credible." Now I understand that the reason it is not considered objective, balanced, and transparent is that it can be edited by anyone. Still, here is a question we must consider: isn't absolutely everything written, filmed, edited, or produced somehow, even if just 0.00001%, biased?

References

Agrawal, A. J. (2016). What Do Social Media Algorithms Mean For You? Forbes. https://www.forbes.com/sites/ajagrawal/2016/04/20/what-do-social-media-algorithms-mean-for-you/#764a18f3a515


Gillespie, T. (2014). Facebook's algorithm: why our assumptions are wrong, and our concerns are right. Culture Digitally. http://culturedigitally.org/2014/07/facebooks-algorithm-why-our-assumptions-are-wrong-and-our-concerns-are-right/

Band, J., & Gerafi, J. (2013). Wikipedia's Economic Value.

Buzzsumo / Kissmetrics. https://blog.kissmetrics.com/which-social-accounts-matter/

Lotan, G. (2013). Networked audiences: Attention and data-informed journalism. In K. McBride & T. Rosenstiel (Eds.), The New Ethics of Journalism. London: Sage.

Ward, S. (2009). Truth and objectivity. In L. Wilkins & C. Christians (Eds.), Handbook of Mass Media Ethics. London and New York: Routledge.
