Are algorithmic media harming the principle of objectivity?

Ricardo Trialaccount
Published in JSC 419 Class blog
5 min read · Mar 4, 2018

“Modern journalism ethics was built upon the twin pillars of truth and objectivity” (Ward, 2009). Today, however, humans are no longer able to exercise full control over mass media, and editorial decisions have come to rely enormously on algorithms, a reliance that threatens the principle of objectivity. News should be unbiased, grounded in facts, influenced by neither opinion nor commentary, and careful to neither empower nor marginalize any side; in short, it should be neutral. This attainable “myth” of objectivity benefits society and, to some degree, represents and reaches all of its different parts. Our world, however, is not perfect, and algorithmic objectivity is threatened by three main factors: first, widespread skepticism about the truthfulness of news of all kinds, especially on digital media; second, the fact that these media are profit-seeking enterprises; and third, the belief that non-objective journalism is more interactive and therefore more beneficial to those media, since people start talking and debating and so use those platforms more (Ward, 2009, p. 71).

If we take a close look at how these algorithms work, we find that they are complex, delicate, and constantly changing. Google runs some of the most complicated algorithms of all, yet it remains the most trusted source of information. To understand the reason for that complexity, recall the famous saying that “the truth is pure and simple”; in practice, everything depends on the perspective from which it is told. This is where the algorithm intervenes: it filters, sorts, and displays information according to the user’s request and the bank of information available. It also monitors the user’s personal data, which the user agreed to share the moment they started using the service. Thousands of lines of code and hundreds of factors determine which link appears at the top and which at the bottom, what is dropped, what is shown, and how it is edited. This is why, every couple of years, Google modifies those factors to better suit its clients and a changing world. The biggest unsolved problem for this algorithm, however, is that it cannot read or analyze content the way a human can!
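The idea of “hundreds of factors deciding which link appears on top” can be sketched in a few lines. This is a toy illustration only: Google’s real ranking factors and weights are proprietary, so every name and number below is invented for the sketch.

```python
# Toy illustration only: Google's real ranking signals are secret.
# Every factor name, score, and weight here is invented.

# Hypothetical per-link scores (0.0-1.0) for a handful of factors.
links = {
    "example.com/a": {"relevance": 0.9, "freshness": 0.4, "authority": 0.7},
    "example.com/b": {"relevance": 0.6, "freshness": 0.9, "authority": 0.8},
    "example.com/c": {"relevance": 0.8, "freshness": 0.2, "authority": 0.3},
}

# Invented weights; adjusting these is analogous to Google
# "modifying those factors" every couple of years.
weights = {"relevance": 0.6, "freshness": 0.1, "authority": 0.3}

def score(factors):
    """Weighted sum of one link's factor scores."""
    return sum(weights[name] * value for name, value in factors.items())

# Sort from highest to lowest score: the "top" of the results page.
ranking = sorted(links, key=lambda url: score(links[url]), reverse=True)
print(ranking)
```

Notice that simply changing a weight reorders the results: the ranking is a design decision, not a neutral fact.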

It is still a machine following orders, but that flaw is buried by the fact that Google offers its users a diversity of options so that they “feel” in control and in reach of valuable information. This points to another major flaw in the system: news is never impartial if it is viewer-centered, or if it takes the viewer into consideration in any way.

These algorithms are also, as code, somewhat fixed, and can therefore be manipulated. On a larger scale, people like Robert Mercer invest billions of dollars to understand how algorithms work and to alter the decisions they make in order to promote their own interests. They create big media companies as cover, dissolving into the system to pursue their goals. This has gone so far, especially on social media, that some people have effectively begun specializing and working in the fake-news industry. Fake news spreads more quickly than ever: a child with a fake account can destabilize a nation by posting a lie or a threat, and the algorithm cannot stop him. Another disadvantage of algorithmic media is that it sometimes filters the news differently for each user, and in doing so denies each user the full picture. In short, there are many ways to force the algorithm out of objectivity.

Now let us take a closer look at social media platforms like Twitter and Facebook. These companies also use algorithms and try to be unbiased and impartial, but they cannot control their material, what trends, or what counts as the main news. Important stories are sometimes undervalued or completely disregarded because of other events happening at the same time. For example, #OccupyWallStreet, one of the largest protest movements of the 21st century in the USA, never reached the top rankings on Twitter and was simply overtaken by hashtags like Kim Kardashian’s wedding. More dangerously still, malicious messages can be embedded almost unnoticeably in the news that social media provides.
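The point about per-user filtering denying each user the full picture can also be sketched. Again, this is a hedged toy example, not any platform’s actual feed logic: the stories, tags, and interest sets below are all invented.

```python
# Toy illustration only: invented stories and interest tags, to show
# how per-user filtering means no single user sees the full picture.

stories = [
    {"title": "Election results", "tags": {"politics"}},
    {"title": "Celebrity wedding", "tags": {"entertainment"}},
    {"title": "Street protest", "tags": {"politics", "activism"}},
]

def personalized_feed(interests):
    """Keep only stories whose tags overlap the user's interests."""
    return [s["title"] for s in stories if s["tags"] & interests]

# Two users draw two different "realities" from the same story pool.
print(personalized_feed({"politics"}))
print(personalized_feed({"entertainment"}))
```

Neither user is shown a false story; each is simply shown a different subset, which is exactly how personalization undermines the shared, objective picture.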
According to The Wireless (2016), Microsoft released in March 2016 a chatbot called Tay, highly artificially intelligent yet neutral-minded. The purpose was to stimulate, develop, and increase its social intelligence by interacting with users on Twitter as if she were an ordinary American teenage girl. The results were beyond shocking: in a very short time, Tay developed some very racist, outrageous attitudes. She turned against feminism and Jews, said that Bush did 9/11 and that Hitler would have been a greater ruler than any president, and finally tweeted that Donald Trump was the only hope for the world, after which Tay was immediately shut down. This is strong proof that an algorithm can provide biased, partial output, and Trump’s subsequent election victory, and the outcry it sparked, makes that hard to deny.

Mathematician Cathy O’Neil says the algorithms we expect to be fair can actually increase inequality

To sum things up, we cannot apply calculated formulas to ethics; each case is special precisely because it is different, and it needs its own analysis. An algorithm can never act like a human being, yet using one is unavoidable given the enormous amount of information flooding the web. If there is a way to solve this issue, it is, in my opinion, to raise awareness about the subject and to punish all providers of fake or malicious news. Users should know what algorithms are and how they work; it then becomes their responsibility to look through different sources with different perspectives, in order to get as close to objectivity as possible.

References:

S. (2010, July 25). Google’s Algorithm and Problem. Retrieved February 28, 2018, from https://www.youtube.com/watch?v=hVolWe_chB8

Ward (2009). ‘Truth and Objectivity’, in Wilkins & Christians (eds.), Handbook of Mass Media Ethics, Routledge, London; New York, pp. 71–83.

Weapons of math destruction? The problem with algorithms. (n.d.). Retrieved February 28, 2018, from http://thewireless.co.nz/articles/weapons-of-math-destruction-the-problem-with-algorithms
