The Echo Chamber, Fake news, and Information validity

Toby Pitts
Published in Wonk Bridge
Mar 14, 2017
This innocuous bubble contains 3 billion people, 15 billion things, and loads of subjectivity

We live in a time of fundamental change. On one hand, an increasing percentage of the world (around 40% of the global population)[1] has access to Internet technology and bountiful information. On the other, we are living in a time where the very foundations of this information are being challenged[2]. With more people gaining access to and creating knowledge, there is the potential for a multiplicity of representations of socio-political events to be broadcast, through blogging, vlogging and livestreaming. This can be empowering, as it allows more and more people to get their voices heard in different forms, from polls to citizen journalism, which potentially shifts political debate toward an enlightened sense of individualism[3].

The Amateur Truth

Yet in reality the complexity of knowledge consumption and creation online raises questions about the very foundations of validity and truth. Striking the right balance between amateurism and legitimacy is a test for knowledge dissemination platforms in the social media age. Professional journalists and news groups used to be the sole parties assigning value to news, but in a post-professional age consumers are the ones who ‘decide what counts as news and how the stories should be told’[4]. New platforms allow for news to be sourced in different ways, a trend reinforced by a huge increase in the proportion of the UK population who consume news online: 60% of adults read online news, newspapers or magazines in 2016, three times the proportion doing so in 2007 (20%)[5]. This trend is set against the backdrop of a rise in populist political groups worldwide, and especially in the West, a movement of retaliation against expertise and centralised control[6].

The Echo Chamber: How You’re Involved

For online social knowledge consumption, one of the key issues that can trouble the foundations of ‘truth’ is the Echo Chamber. The term comes from acoustics, where it describes a space in which sounds reverberate. In the context of social media, however, an Echo Chamber of ideas and thoughts is created when users gravitate towards others who share their opinions. Gravitating toward others with similar ideas is a natural social process[7], but it is exaggerated by many features of online social tools and spaces.

The Double-Edged Algorithm

Technologically, algorithms are key to understanding how content is shaped on social media. In the case of Facebook, as we ‘like’ and share information, that intelligence is processed into the algorithmic equation that shapes the content of our News Feed[8]. Facebook’s EdgeRank algorithm has three core elements: user affinity, content weight and a time-based decay parameter[9].
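As an illustration, such a score can be sketched as the product of those three elements. The function below is a toy, not Facebook’s actual formula: the parameter names, the exponential half-life decay and the example weights are all assumptions made for the sake of the example.

```python
def edge_score(affinity, weight, age_seconds, half_life_hours=24.0):
    """Toy EdgeRank-style score: affinity x weight x time decay.

    affinity:    how strongly this user interacts with the item's creator
    weight:      how valuable the interaction type is (e.g. comment > like)
    age_seconds: how old the story is; newer stories decay less
    The exponential half-life decay is an assumption; Facebook has never
    published its actual decay function.
    """
    decay = 0.5 ** (age_seconds / (half_life_hours * 3600))
    return affinity * weight * decay

# A fresh comment from a close friend outranks a days-old like from a
# weak connection, so the feed would surface the former first.
fresh_comment = edge_score(affinity=0.9, weight=3.0, age_seconds=600)
stale_like = edge_score(affinity=0.1, weight=1.0, age_seconds=3 * 86400)
```

Ranking candidate stories by a score of this shape and showing the highest first reproduces the behaviour described above: content you already engage with floats upward.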

Although Facebook has publicly revealed these core components, the exact details of the algorithm are not publicly visible, meaning that the content curator remains an unrelatable mathematical entity to most everyday users. The algorithm is also constantly being tweaked, which makes any kind of transparency difficult[10].

Hyper-personalization is therefore perpetuated both by user inputs (such as reacting to content[11]) and by algorithmic equations. At any one time, the same Facebook or Google search performed by separate users will return different lists of results[12]. Media behemoths such as Facebook consequently become gatekeepers to the vast amount of knowledge found online, surrounding information in a technological shell.

As algorithms such as EdgeRank adapt to our likes and predilections, the trends already present in our social groupings are accentuated. If I am a fervent fan of instructional cooking videos (Buzzfeed’s Tasty videos, for example[13]) I will begin to see more content that reflects the same trend and less of that which does not. In this context the effect seems harmless, yet if instructional cooking videos are replaced by extreme political propaganda the danger of the Echo Chamber becomes more apparent. There is an inherent risk in only hearing opinions that reflect your own, and validation from our social media peers can lead to abnormal rationalization.
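This reinforcement can be made concrete with a toy simulation (every number in it is an assumption chosen for illustration, not a measured parameter): a feed picks between two topics in proportion to their current weights, and every view slightly boosts the chosen topic’s weight, so a small initial preference compounds toward near-total dominance.

```python
import random

def simulate_feed(rounds=1000, boost=1.05, seed=42):
    """Toy feedback loop: engagement raises a topic's future ranking weight."""
    rng = random.Random(seed)
    weights = {"cooking": 1.1, "politics": 1.0}  # a slight starting preference
    for _ in range(rounds):
        total = sum(weights.values())
        # The feed shows a topic in proportion to its current weight...
        shown = "cooking" if rng.random() < weights["cooking"] / total else "politics"
        # ...and the resulting engagement feeds back into the ranking.
        weights[shown] *= boost
    total = sum(weights.values())
    return {topic: w / total for topic, w in weights.items()}

shares = simulate_feed()  # after 1000 rounds, one topic dominates the feed
```

The exact numbers are arbitrary, but the dynamic is the point: a multiplicative boost on whatever gets shown locks the feed into whichever topic pulls ahead early, which is the Echo Chamber effect in miniature.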

Clean your Data

In the Digital Humanities, truly valuable data can often be categorized as data with context. What Facebook timelines and Twitter feeds do is transpose the data, or the news article, out of the original context in which it was written. As we skim-read these articles we undertake vastly different processes than if we were reading or watching the news in a more traditional form. If the focus of news on social media is placed on the immediacy of the interaction, rather than on the frame and contextual information, then the inherent value of the data is sidelined, as is the emphasis on source and point of origin.

On sites like Facebook, data, noise, opinion and fact all intermingle in a torrent of information that is fed back to us as users. Satire (e.g. The Daily Mash, The Onion) coalesces with more traditional news and pseudo-news sites, and as we scroll through a Facebook News Feed the array of information on offer creates only transient moments of connection and contextualisation. From a neurological perspective we are becoming bi-literate readers, able to process and skim information quickly, though this style of reading comes at the cost of deeper reading techniques[14]. If our online lives and social interactions exist in this kind of space, and if the very way in which we read is being altered by the platforms we read on, the validity of what we interpret becomes hard to gauge. It is also hard to maintain objectivity about the processes we undertake when social media is such an intrinsic part of our day-to-day lives.

Can you widen your vision?

When the Echo Chamber effect is combined with this lack of validity and source scrutiny, social media can become a powerful tool for spreading disinformation[15]. Should the responsibility for quelling the fake news trend fall on tech companies, to create technologies that prioritize validity, context and authenticity?

Should it fall on educators? Tim Cook has called for ‘the modern version of a public service announcement campaign’[16], and education certainly has a big part to play in fostering objectivity toward news sources, an idea that has already led to ‘fake news’ awareness classes in schools[17]. Tools such as Politifact.com also aim to debunk fake news. You can also find Wonk Bridge’s interview with Alessandro Gandini, a leading digital ethnographer, on the topic below.

Or perhaps we as users need to make a conscious effort to become more objective in our knowledge consumption. Ultimately, if the technology that underpins how we consume and share news doesn’t change, or isn’t forced to change, the problem of fake news risks being perpetuated by the very digital tools that are so central to our social activity.

Toby Pitts is a student in the Department of Digital Humanities at King’s College London. You can follow him on Twitter.

[1] www.internetworldstats.com/stats

[2] Oxford Dictionary’s decision to make ‘Post-Truth’ 2016’s word of the year is perhaps the most telling sign of this.

[3] Downey, E. & Jones, M. A. (eds.) (2012) Public service, governance and Web 2.0 technologies: future trends in social media. Premier reference source. Hershey, PA: Information Science Reference.

[4] http://www.jstor.org/stable/10.5406/j.ctt18j8wqx.4?seq=4#page_scan_tab_contents

[5] https://www.ons.gov.uk/peoplepopulationandcommunity/householdcharacteristics/homeinternetandsocialmediausage/bulletins/internetaccesshouseholdsandindividuals/2016

[6] The stigma attached to ‘so-called experts’ by the Trump administration and the retaliation against ‘experts’ during the EU referendum are prime examples of this political sentiment.

[7] http://www.telegraph.co.uk/news/science/science-news/3336375/We-prefer-people-we-think-are-similar-to-ourselves.html

[8] http://time.com/collection-post/3950525/facebook-news-feed-algorithm/

[9] http://edgerank.net/

[10] http://wallaroomedia.com/facebook-newsfeed-algorithm-change-history/

[11] http://newsroom.fb.com/news/2016/02/reactions-now-available-globally/

[12] http://ed.ted.com/lessons/beware-online-filter-bubbles-eli-pariser

[13] http://nymag.com/thecut/2016/03/zen-and-the-art-of-the-buzzfeed-tasty-video.html

[14] https://www.pri.org/stories/2014-09-18/your-paper-brain-and-your-kindle-brain-arent-same-thing

[15] http://www.bbc.co.uk/news/blogs-trending-38156985

[16] http://www.telegraph.co.uk/technology/2017/02/10/fake-news-killing-peoples-minds-says-apple-boss-tim-cook/

[17] http://www.theglobeandmail.com/news/world/the-new-civics-course-us-schools-teaching-students-how-to-spot-fake-news/article33995852/
