Are algorithmic media harming the principle of objectivity?

Mabelle Abbas
JSC 419 Class blog

--

Ever since we were little children, our parents taught us to listen to both sides of a story before judging; taking a stand on what we believe is right comes later. They taught us that in order to listen to both sides, we should refrain from being biased and focus on being as objective as possible. Nowadays, being objective is one of the hardest things to do, since the search engines we use, and the algorithms behind them, pave a road toward what they think we want to hear and read. Our every move is tracked, and with that information those search engines surface what they think we will look at, while the other side of the story is left out entirely. The real question here is: are algorithmic media harming the principle of objectivity?

First things first, how is objectivity defined? According to Bruckner, a report is objective if “it is a factual and accurate recording of an event; it reports only the facts and eliminates comment, interpretation and speculation by the reporter and is neutral between rival views on an issue.” By this definition, the slightest comment from the journalist makes an article subjective and pushes it toward one side more than the other. However, critics were quick to object. Henry Luce, the founder of Time magazine, stated: “Show me a man who thinks he’s objective and I’ll show you a liar.” Three main arguments explain why objectivity is less ideal than it sounds. First, objectivity has been criticized for being too demanding, even a myth. Second, it is said to be undesirable because it pushes writers into restrictive formats. Third, it restricts the freedom of the press. The main idea is that no person can ever be fully objective, because journalism is not built on objectiveness alone; it rests on freedom of speech and the delivery of information, which some argue cannot happen under such strict rules and formats.

Where does objectivity stand now that the world is ruled by social media? Is what we search for objective, or are we being controlled, with our own information used against us?

According to Lotan in The New Ethics of Journalism, algorithms are “a finite list of instructions that a machine performs in order to calculate a function (i.e. search, recommendations, friend suggestions), a string of commands programed to execute defined tasks, a piece of computer code that transforms input data into a desired output, based on specified calculations.” Algorithms once helped us solve mathematical problems in school; nowadays they run our everyday lives. They use our own information with us, and sometimes against us. They track our searches, our likes and our dislikes, and use them for their profit as well as our convenience. When we post our information on sites such as Facebook or Instagram, that information belongs to them and is no longer private. Companies buy it and use it to their advantage to advertise and sell their products.
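To make Lotan’s “input data into a desired output” concrete, here is a minimal sketch of a recommendation algorithm. Everything in it is invented for illustration (the function name, the toy posts, the topic tags): it simply counts which topics a user has already engaged with and ranks new items by how well they match those past interests.

```python
from collections import Counter

def recommend(history, catalog, k=2):
    """Toy recommender (illustrative only): input data -- a user's
    reading history -- is transformed into a 'desired output', a
    ranking of new items by overlap with past interests."""
    interests = Counter(topic for item in history for topic in item["topics"])
    def score(item):
        # Items sharing more topics with past reads score higher.
        return sum(interests[t] for t in item["topics"])
    return sorted(catalog, key=score, reverse=True)[:k]

history = [
    {"title": "Post A", "topics": ["politics", "us"]},
    {"title": "Post B", "topics": ["politics", "media"]},
]
catalog = [
    {"title": "New politics story", "topics": ["politics"]},
    {"title": "Science feature", "topics": ["science"]},
    {"title": "Media analysis", "topics": ["media", "politics"]},
]
print([item["title"] for item in recommend(history, catalog)])
# → ['Media analysis', 'New politics story']
```

Notice what never appears in the scoring: the other side of the story. The science feature is simply ranked last, which is the point Lotan’s definition quietly hides.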

In the documentary Trumping Democracy, the Robert Mercer scandal was brought to light. Mercer bought Breitbart News and installed Steve Bannon as the manager of Trump’s campaign. The outrage arose because the campaign’s data operation used data from Google, Facebook, banks and other sources to identify which voters would vote for Trump and which would not. It also used a hidden Facebook feature named the “dark post,” showing manipulative, personalized messages to millions of people, messages that would later disappear. How was that possible? Algorithms, of course.

According to Gillespie, the technical character of the algorithm is positioned as an assurance of impartiality. Yet the objectivity and impartiality of information sources are limited by our own prejudices and pretensions, because what we search for and read feeds directly into what algorithms play back to us. In other words, we create our own filter bubbles, and algorithms help us stay inside them.
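The feedback loop behind a filter bubble can be shown with a toy simulation, entirely invented for illustration: the feed shows whichever topic the user has clicked most, the user clicks what is shown, and a single chance first click snowballs into a bubble.

```python
import random
from collections import Counter

def filter_bubble(topics, clicks=20, seed=0):
    """Toy feedback loop (illustrative only): each round the 'feed'
    replays the user's most-clicked topic, and the click on it
    reinforces that preference further."""
    rng = random.Random(seed)
    interests = Counter()
    interests[rng.choice(topics)] += 1      # one chance first encounter
    for _ in range(clicks):
        shown, _ = interests.most_common(1)[0]  # feed shows the favourite
        interests[shown] += 1                   # the click reinforces it
    return interests

bubble = filter_bubble(["politics", "sport", "science", "culture"])
print(bubble)  # every single click lands on the first topic encountered
```

After twenty rounds, one topic has all the clicks and the other three have never been shown at all, which is Gillespie’s point: the machinery is neutral, but it amplifies whatever we fed it first.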

Furthermore, algorithms are creating more and more problems. Search engines have gained such power over the years that they have almost taken on a life of their own. They take pride in shaping our personalities, ideas, dreams and networks. People trust search engines to tell them which information is correct, as long as it fits with their beliefs. These engines base their results on quantitative rather than qualitative measurements. Using instantaneous tracking of preferences and behaviors, such as trends and likes, algorithmic media decide which content to send us. In other words, the more frequently a site is visited, regardless of the accuracy of its information, the more likely it is to appear within the first few results. For this reason, we are told over and over by scholars never to look for our information only on the first page of Google, but to dig into the third or fourth. Companies such as Google, Twitter and Facebook have been committing the same fallacy: unimportant topics circulate and overshadow important ones. Take a recent example, the bombing in Syria that killed over 400 people. Why hasn’t this hashtag been trending? Simply because people haven’t been talking about it as much, and since they haven’t, this devastating event does not show up. Is the reveal of Kim Kardashian’s new baby more important than the lives of 150 innocent children?
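The “quantitative over qualitative” problem above fits in a few lines. This is a hypothetical sketch with invented data, not any real engine’s algorithm: ranking purely by visit count, with accuracy recorded but never consulted.

```python
def rank_results(pages):
    """Popularity-only ranking (illustrative sketch): order results
    by visit count alone; the 'accurate' field is never consulted."""
    return sorted(pages, key=lambda p: p["visits"], reverse=True)

pages = [
    {"url": "accurate-report.example", "visits": 1200,  "accurate": True},
    {"url": "viral-rumour.example",    "visits": 50000, "accurate": False},
    {"url": "fact-check.example",      "visits": 300,   "accurate": True},
]
print(rank_results(pages)[0]["url"])
# → viral-rumour.example
```

The most-visited page tops the list even though it is the one flagged as inaccurate, which is exactly why a trending hashtag measures attention, not importance.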

In a nutshell, there is no objectivity left. Social media has harmed most aspects of it. We, as users, have our own information used against us no matter how hard we try. We need to learn to keep our private information to ourselves and not hand it to the world. The information we post is no longer ours and can be used to help horrible people such as Trump rise to power. We are selling ourselves without even realizing it, and we are trusting search engines that study us like laboratory rats to get us through our lives. Social media companies should provide disclaimers that people can actually read, not 200 pages long, explaining the consequences of our actions.

References

Baughman, J. (1987) Henry R. Luce and the rise of the American news media. Boston: Twayne.

Bruckner, cited in Ward (2009) ‘Truth and Objectivity’ in Wilkins & Christians (eds.) Handbook of Mass Media Ethics, Routledge, London; New York, pp. 71–83; p. 73.

Huchon, T. (Director). (2017). Trumping Democracy [Motion picture on Online]. United States Of America: Spicee.

Lotan, G. (2013) ‘Networked Audiences’ in McBride, K. & Rosenstiel, T. (eds.) The New Ethics of Journalism, Sage, London, p. 4.

Gillespie, T. ‘The Relevance of Algorithms’ in Gillespie, T., Boczkowski, P. & Foot, K. (eds.) Media Technologies, MIT Press, Cambridge, MA, p. 1.

Tedx Talks. (Producer). (2016, January 11). How Social Media Algorithms are Stifling Innovation | Rhea Jain | TEDxYouth@SAS [Video file]. Retrieved February 28, 2018, from https://www.youtube.com/watch?v=0qCI_ow1Y24

Ted Talks. (Producer). (2015, December 7). The moral bias behind your search results | Andreas Ekström [Video file]. Retrieved February 28, 2018, from https://www.youtube.com/watch?v=_vBggxCNNno
