Objectively Biased

Yara Issa
Published in JSC 419 Class blog

5 min read · Feb 27, 2019

Today’s media landscape is very different from what it was a few decades ago. Legacy media has been deprioritized in favor of newer algorithmic media. Legacy media, in the form of radio, TV, and newspapers, was how people used to receive most of their information. It was created by human editors and distributed to everyone equally: everyone had the opportunity to be exposed to the same content and could choose for themselves what information to take in.

With the growth of social media and algorithmic media, that all changed. Companies like Facebook and YouTube could use algorithms to learn a user’s preferences and then use that information to decide what content was shown to them on their sites. For the first time, what one person saw on their screen could be completely different from what another saw on theirs, even if they were looking at the same website. This is a problem for objectivity, which Merriam-Webster defines as “lack of favoritism towards one side or another” and “freedom from bias.” An algorithm that gives you only the content it thinks you want, without regard for any other kind of content, is favoritism by design: the algorithm decides, on the viewer’s behalf, what is interesting enough to show. This is dangerous for an open and democratic society. A democracy is led by the people, and if the people are misinformed or trapped in ideological bubbles, their ability to think past their biases and reach fair solutions suffers.
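To make that mechanism concrete, here is a minimal sketch of engagement-based personalization. The scoring rule, the Post class, and the interest profiles are all my own invented illustration, not any platform’s actual system:

```python
from dataclasses import dataclass

@dataclass
class Post:
    topic: str
    headline: str

def personalized_feed(posts, user_interests, k=2):
    """Rank posts by a crude 'predicted interest' score.

    user_interests maps topic -> affinity, imagined as learned from a
    user's past clicks. Topics the user never engages with score 0 and
    sink to the bottom, so two people on the same site see different feeds.
    """
    ranked = sorted(posts, key=lambda p: user_interests.get(p.topic, 0.0), reverse=True)
    return ranked[:k]

posts = [
    Post("politics-left", "Op-ed: expand voting access"),
    Post("politics-right", "Op-ed: tighten border security"),
    Post("sports", "Local team wins the championship"),
    Post("economy", "How oil prices shape global markets"),
]

alice = {"politics-left": 0.9, "sports": 0.4}  # inferred from Alice's clicks
bob = {"politics-right": 0.8, "economy": 0.3}  # inferred from Bob's clicks

print([p.headline for p in personalized_feed(posts, alice)])
print([p.headline for p in personalized_feed(posts, bob)])
```

Same four posts, two entirely different front pages, and neither user ever sees the content the algorithm scored at zero for them.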

Even so, was legacy media free from bias to begin with? Or is truly objective media impossible? Gillespie (n.d.) sheds some light here when he says that “journalists use the norm of objectivity as a ‘strategic ritual’ to lend public legitimacy to knowledge production tactics that are inherently precarious” (p. 15). Knowledge production is “precarious” because humans are inherently biased, and publications, as companies run by humans, are biased too. The American outlet Fox News is widely known for being mostly conservative, while other famous publications like The New York Times lean more liberal. Without algorithmic media, huge news aggregators like Facebook would inevitably end up leaning toward a specific side. Instead, the algorithm gives each user what they most likely want to see, regardless of which “side” they are on. While that is problematic in its own right, it can also be seen as more objective than the alternative. That illusion falls apart, however, once you remember that the algorithms are still constructed by humans with biases, so they can be biased too. Google has historically “altered their search results” in response to high-profile incidents, such as “a racist Photoshopped image of Michelle Obama at the top of the image search results” (Gillespie, n.d., p. 15), which shows that Google has the power to quietly tune its algorithm to change what users see. True objectivity cannot be achieved as long as humans are behind the scenes calling the shots.

While social media has done a lot for us as a society, its future might not be as happy as its past. Social media brought us closer together and allowed us to connect globally in ways that did not exist before. It also democratized content, meaning that anyone with an opinion could get their voice heard online. That is both a good and a bad thing: it is good to have a system where people can be heard, especially in a democracy, but publications existed as a way of standardizing quality and ensuring that anything published was factual and digestible. Without publications safeguarding against baseless opinions, there is no quality control, and the average quality of content falls. Couple that with the fact that, because of the way trend algorithms work, “the true bottle neck is no longer what gets published but rather what gets attention” (Lotan, n.d., p. 106), and you have a recipe for disaster. A system that rewards attention instead of journalistic integrity encourages content creators to write sensational clickbait rather than well-researched, well-supported articles. A man who makes up a story about seeing a ghost will generate more attention than someone writing a researched article about the oil economy, simply because the ghost story is wilder and more fun, even though the oil article is more important for people to be informed about. Placing value on attention instead of on integrity is dangerous, and it degrades the quality of the information we receive.
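A toy illustration of that incentive, with engagement numbers I made up for the example: a trend algorithm that ranks purely by attention puts the ghost story on top every time, because it never even looks at whether a story was researched:

```python
# Invented engagement numbers, purely for illustration.
stories = [
    {"headline": "I saw a ghost in my basement!", "clicks": 50_000, "researched": False},
    {"headline": "How the oil economy really works", "clicks": 3_000, "researched": True},
]

# Ranking by clicks alone ignores the 'researched' field entirely.
for story in sorted(stories, key=lambda s: s["clicks"], reverse=True):
    print(f"{story['clicks']:>6} clicks | {story['headline']}")
```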

Overall, I learned that algorithms aren’t as smart as we like to think they are, because humans are still controlling them. We also haven’t fully cracked the code to develop a perfect vertical search, so for the foreseeable future, search will still need to be manipulated by humans depending on context. While a newsfeed that knows what you want sounds very nice and convenient in real life, the reality is that such a newsfeed can threaten democracy by polarizing the content you receive. In that sense, the newsfeed can control what you see and consume, and to an extent, what you think. “Algorithmic objectivity” is hard to achieve, but one way to get closer would be to give our current algorithms an understanding that diversity of opinions is healthy. Developing tools that prevent ideological bubbles and make sure people are fed content that helps them grow and find new ideas would go a long way.
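One way to picture that fix, reusing the Post class and the posts and alice data from the earlier sketch (again my own invention, not a deployed system), is a feed that reserves a couple of slots for topics outside the user’s usual interests:

```python
import random

def diversified_feed(posts, user_interests, k=4, explore_slots=2):
    """Personalized feed that reserves slots for unfamiliar topics.

    Most slots are still ranked by predicted interest, but a few
    'exploration' slots surface topics the user never clicks on,
    nudging them out of their ideological bubble.
    """
    familiar = [p for p in posts if p.topic in user_interests]
    unfamiliar = [p for p in posts if p.topic not in user_interests]

    ranked = sorted(familiar, key=lambda p: user_interests[p.topic], reverse=True)
    feed = ranked[: k - explore_slots]
    feed += random.sample(unfamiliar, min(explore_slots, len(unfamiliar)))
    return feed

# Alice now also sees the op-ed from the other side and the economy story.
print([p.headline for p in diversified_feed(posts, alice)])
```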

References

Gillespie, T. (n.d.). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies. Cambridge, MA: MIT Press.

Lotan, G. (n.d.). Networked audiences. In K. McBride & T. Rosenstiel (Eds.), The new ethics of journalism. Sage.

Merriam-Webster. (n.d.). Objectivity. Retrieved from https://www.merriam-webster.com/dictionary/objectivity
