Is objectivity attainable through algorithms?

Nadine Safi
Published in JSC 419 Class blog
Oct 11, 2016

The media are the public’s main source of information and play a major role in shaping people’s views and mentalities. Objectivity is complete detachment from, and lack of influence by, personal feelings and opinions when reporting on a subject; in other words, it is a factual and accurate recording of an event. As defined by Ward, “traditional objectivity is the idea that reporters should provide straight, unbiased information without bias or opinion” (Ward, 2009, p. 71). The media are expected to be unbiased and objective when delivering and distributing important news, so that people can form their own judgments on the matter rather than being spoon-fed what to judge and think. Journalists should report without involving their personal beliefs and should not impose socially constructed viewpoints on their audience. Unfortunately, media objectivity is a myth, because it is impossible to remain completely unaffected and uninfluenced by values and opinions.

Objectivity is also criticized for being uniform, in the sense that it reproduces dominant ideologies and points of view that are portrayed the same way to all audiences. As Ward states, “objectivity, even if possible, is undesirable because it forces writers to use restricted formats. It encourages a superficial reporting of official facts” (Ward, 2009, p. 75). That is, if journalists were truly objective, the content they produce would be a robotic formulation of facts rather than an interpretation of the news. As Ward also puts it, “Objectivity ignores other functions of the press such as commenting, complaining and acting as public watchdog” (Ward, 2009, p. 75). In other words, journalists could not carry out much of their actual job if they were 100% objective, because that would require them to completely detach themselves, both emotionally and in their reasoning, from the situation.

An algorithm is a finite list of instructions, a string of commands programmed to execute defined tasks in a specific way. Computer algorithms have taken over the role of editors: search engines and media sites surface news according to the mechanics that run them, and the algorithm chooses what is “relevant” by referring to what is most viewed and most clicked. Because these systems are designed to find trends and patterns, they cannot be considered objective; their selection of pertinent information is based on previous data collection and analysis. Algorithms give the public a distorted representation of the news by focusing only on trending topics and the highest engagement, disregarding pressing news stories in a way that conflicts with the criteria of newsworthiness. For example, #OccupyWallStreet was a historic event in New York, yet it never trended on Twitter because it was overpowered by #KimKWedding.
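To make this mechanism concrete, here is a minimal, hypothetical sketch in Python of popularity-based “trending” selection. It is not any platform’s actual algorithm: the hashtags and post counts are invented, and the only point is that ranking purely by engagement lets a high-volume celebrity topic crowd out a significant but less-posted story.

```python
# Hypothetical sketch of popularity-based "trending" selection, not any
# platform's real ranking code. Topics are ranked purely by how often they
# appear, a stand-in for views/clicks.

from collections import Counter

def trending_topics(posts, top_n=3):
    """Return the top_n hashtags ranked only by raw frequency."""
    counts = Counter(tag for post in posts for tag in post["hashtags"])
    return [tag for tag, _ in counts.most_common(top_n)]

# Invented engagement data: the celebrity topic simply has more posts.
posts = (
    [{"hashtags": ["#KimKWedding"]}] * 5000
    + [{"hashtags": ["#OccupyWallStreet"]}] * 1200
    + [{"hashtags": ["#LocalElection"]}] * 300
)

print(trending_topics(posts, top_n=1))
# ['#KimKWedding'] -- with a short trending list, the protest never surfaces,
# regardless of its newsworthiness.
```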

Gillespie confirms that “Algorithms play an increasingly important role in selecting what information is considered most relevant to us, a crucial feature of our participation in public life. […] Search engines help us navigate massive databases of information” (Gillespie, 2012, p. 1). One could argue that Gillespie’s word “help” is not quite accurate: because algorithms expose us mainly to the most popular information, they are not exactly helping us navigate the web. For example, most users never get past the first or second page of Google results because they assume the most relevant information sits at the top, an assumption that is often neither accurate nor reliable.

A more accurate term would be “assumption,” which is what these systems make about our searches and feeds. Algorithms follow a set of guidelines, including patterns of inclusion and cycles of anticipation (Gillespie, 2012), and it is these patterns that determine whether data is included or not.

Gillespie underlines that Google is particularly careful when it comes to neutrality (Gillespie, 2012, p. 14). The company has introduced automatically personalized results, tailored to our specific interests and needs instead of generalized for all audiences. Google seems to listen to users’ complaints and to fix what is criticized. For example, as Gillespie points out, “they provide a SafeSearch mechanism for keeping profanity and sexual images from minors; and they refuse to autocomplete search queries that specify torrent file-trading services” (Gillespie, 2012, p. 15). But in fact, Google believes that mechanics are smarter and more reliable than human manipulation of search engines, which is why its definition of neutrality is to let the engine work according to its own system and program, and then to step in to alter what could be controversial. This has its downsides, since it shifts emphasis away from actual content toward combinations of links and keywords that fit the frames of the algorithm, which in turn provide us with results. This statistical approach to knowledge, and this robotic way of receiving information, can disrupt relevance and thus manipulate and shape the user’s opinion.
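The two-step pattern described above, where the engine ranks on its own signals first and human-defined rules step in afterwards, can be sketched roughly as follows. This is a hypothetical illustration, not Google’s code: the domain names, scores, and blocklists are invented, and the filter and autocomplete rules only mimic the kind of SafeSearch and autocomplete interventions Gillespie mentions.

```python
# Hypothetical sketch of "mechanics first, manual overrides after".
# All names, scores, and lists below are invented for illustration.

BLOCKED_AUTOCOMPLETE_TERMS = {"torrent"}          # assumed example blocklist
SAFESEARCH_FLAGGED = {"explicit-site.example"}    # assumed example flag list

def rank_results(results):
    # Step 1: purely "mechanical" ranking by a popularity-like score.
    return sorted(results, key=lambda r: r["score"], reverse=True)

def apply_overrides(results, safesearch=True):
    # Step 2: human-defined rules step in only after the ranking has run.
    if safesearch:
        results = [r for r in results if r["domain"] not in SAFESEARCH_FLAGGED]
    return results

def autocomplete(prefix, suggestions):
    # Refuse to complete queries that contain blocked terms.
    return [s for s in suggestions
            if s.startswith(prefix)
            and not any(term in s for term in BLOCKED_AUTOCOMPLETE_TERMS)]

results = [
    {"domain": "news.example", "score": 0.72},
    {"domain": "explicit-site.example", "score": 0.95},
]
print(apply_overrides(rank_results(results)))   # flagged domain removed after ranking
print(autocomplete("tor", ["torch reviews", "torrent sites"]))  # ['torch reviews']
```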

In conclusion, objectivity was already impossible to achieve in traditional media, and even though algorithms claim to be completely neutral with the information they provide, we can no longer expect objectivity in our news distribution. The way these mechanics select and divide information still violates ‘objective’ selection, because they do not give the user direct, full access to all sources and instead limit them to ‘often-visited’ links. These trending topics and results catch users’ attention and divert it from their initial expectations; algorithms trick them into following the patterns found in data analyses, which inevitably restructures their viewpoints on the subject.

Resources

Gillespie, T. (2012). The Relevance of Algorithms. In Media Technologies.

Ward, S. (2009). Truth and Objectivity. In Wilkins & Christians (Eds.), The Handbook of Mass Media Ethics. London; New York: Routledge.
