Nourhanne Aoun
Published in JSC 419 Class Blog · Mar 2, 2019

Filtered and Presented

On Algorithmic Objectivity

The sphere within which we live socially is formed by collective public opinion and by the many media outlets that shape, filter, categorize, and present that opinion to us in ways we, as a public, are largely unaware of (Caplan & Boyd, 2016). The major outlets from which we retrieve our news and form our perception of things use algorithms to present relevant, curated articles and posts. The Trending page on Twitter, for example, offers us the most relatable and relevant topics: those that have accumulated the most mentions, retweets, and likes (a toy sketch of such a ranking appears below). In that way, the algorithm gives us a perception of what we should assume to be the most important things happening across the Twittersphere.
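To make that mechanism concrete, here is a minimal, purely illustrative sketch of a trending ranking in Python. The signal weights, field names, and scoring formula are all invented for illustration; Twitter's actual algorithm is proprietary and far more elaborate.

```python
from collections import Counter

def trending_topics(posts, top_n=10):
    """Toy trending ranking: score each hashtag by a weighted sum of
    engagement signals. The weights and field names are assumptions
    made for illustration, not Twitter's actual logic."""
    scores = Counter()
    for post in posts:
        engagement = (1.0 * post["mentions"]
                      + 2.0 * post["retweets"]
                      + 0.5 * post["likes"])
        for tag in post["hashtags"]:
            scores[tag] += engagement
    return [tag for tag, _ in scores.most_common(top_n)]
```

Even in this toy version, notice that someone had to choose the weights: deciding that a retweet counts twice as much as a mention is already an editorial judgment baked into code.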

How reliable and valid are these algorithms, really? How can they mine our data and browsing history and then assume that this is all we seek to know about and relate to? In this way, algorithms have compromised the objectivity and breadth of journalistic, personal, and social information.

Social media platforms and subjectively curated news outlets differ in two main ways. First, social media allows people to become the creators of their own socially consumable information, which adds diversity, versatility, and reach. News outlets, by contrast, control what information is presented, rewriting headlines into catchy, tabloid-like slogans to capture attention and retain readers, so that those articles spread further than others that deserve attention (Caplan & Boyd, 2016).

Second, ordinary social media users do not post according to an institutional agenda, while media giants such as Facebook follow a particular agenda in filtering the news that is offered and transmitted to the audience. This was clear in the way Facebook measurably influenced turnout in the 2010 U.S. midterm elections: ‘What they found was that, through manipulating their algorithms and adding features like an “I Voted” button, users who were exposed to these messages were 0.39% more likely to vote than those in the control group, effectively mobilizing a potential total of 400,000 voters to cast their ballot in the 2010 midterm elections’ (Caplan & Boyd, 2016). An effect of that scale is radical for any culture. The question remains, though: does software designed and programmed with a purpose serve an ethical one? Are cultures being made to follow the perceptions and guided outlines of the people in control of these algorithms, and how can we achieve algorithmic objectivity to ensure an unbiased, fully scoped, and informative news service for the public sphere?

[Diagram: how Facebook’s news feed ranks and filters content before presenting it] (Christl, 2018)

The diagram above details how Facebook processes its information before presenting it. Much like Facebook, Google claims its algorithm examines over 200 signals for every query before presenting the most relevant results. Therein lies the problem: algorithms embody sets of criteria that determine what is relevant and what is not. They dictate relevance to users who never chose those criteria; they are engineered to spread news that is being paid for; they ultimately change and mould public opinion; they reshape user practices involuntarily; they lead to misleading reports on user statistics; and they sideline other sites and other important pieces of news that otherwise need to be heard, shared, and commented on (Gillespie, 2014). A sketch of such signal-weighted scoring follows.
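As a hedged illustration of what “examining signals” might look like in code, here is a minimal Python sketch. Every signal name, formula, and weight below is a hypothetical stand-in; real ranking systems combine hundreds of proprietary signals.

```python
def relevance_score(item, user):
    """Hypothetical signal-weighted relevance score. All names and
    weights are assumptions for illustration, not Google's or
    Facebook's actual ranking logic."""
    signals = {
        "text_match": item["match_score"],             # query/content overlap
        "freshness": 1.0 / (1.0 + item["age_hours"]),  # newer ranks higher
        "engagement": item["clicks"] / max(item["impressions"], 1),
        "affinity": user["topic_affinity"].get(item["topic"], 0.0),
    }
    weights = {"text_match": 0.5, "freshness": 0.2,
               "engagement": 0.2, "affinity": 0.1}
    return sum(weights[name] * value for name, value in signals.items())
```

The point of the sketch is that “relevance” is not discovered but defined: whoever sets the weights decides what surfaces and what disappears.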

Algorithms, in short, create a restrictive limitation on news sources, turning news communication into an agenda-driven channel. In doing so, they also erode user privacy. As Gillespie states, ‘Sites hope to anticipate the user at the moment the algorithm is called upon, which requires knowledge of that user gleaned at that instant, knowledge of that user already gathered, and knowledge of users estimated to be statistically and demographically like them’ (Gillespie, 2014). The problem is that users are formalized into categories and gradually come to be informed by what they are given, not by what they seek. The sketch below illustrates those three kinds of knowledge.
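To illustrate Gillespie’s three knowledge sources, here is a minimal Python sketch. The data structures and field names are invented; they merely stand in for whatever a real platform stores about its users.

```python
def personalized_filter(candidates, session, profile, segment):
    """Combine the three knowledge sources Gillespie describes:
    what the user is doing right now (session), what has already
    been gathered about them (profile), and what statistically
    similar users like (segment). All fields are hypothetical."""
    interests = (set(session.get("current_clicks", []))       # this instant
                 | set(profile.get("stored_interests", []))   # already gathered
                 | set(segment.get("shared_interests", [])))  # lookalike users
    # Only items matching the inferred interest categories survive,
    # so the user sees what they are given, not what they might seek.
    return [c for c in candidates if c["topic"] in interests]
```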

In conclusion, this evaluation has shown me that algorithms are privacy-invading and knowledge-limiting pieces of software. Algorithmic objectivity is a must. To achieve it, people need to become aware of how algorithms work; users must be able to set their own relevance criteria and judge what is relevant and what is not; search results must be engineered to present extensive variety; news outlets, companies, and political parties must have no say in algorithm design; and, finally, users must understand that ‘algorithms are made and remade in every instance of their use because every click, every query, changes the tool incrementally’ (Gillespie, 2014).


References

Gillespie, T. (2014). The Relevance of Algorithms. Media Technologies, 167–194. doi:10.7551/mitpress/9780262525374.003.0009

Christl, W. (2018, April 29). Facebook, Twitter, LinkedIn and Instagram news feed “algorithm ranking signals”, as interpreted by @stedavies. Retrieved from https://twitter.com/wolfiechristl/status/990687504809459712

Caplan, R., & Boyd, D. (2016). Who Controls the Public Sphere in an Era of Algorithms? Mediation, Automation, Power. Retrieved from https://datasociety.net/pubs/ap/MediationAutomationPower_2016.pdf
