Confirmation Bias Inflamed in Echo Chambers

Hisham Fakhreddin
3 min read · Nov 19, 2017


The echo chamber phenomenon seems to have grown rampantly in recent years, ever since the launch of social media platforms such as Facebook and Twitter. One of the things happening behind the scenes on Facebook, for example, is its content algorithm: content appears in your news feed based on the pages you like and follow and on the posts your online friends share with you. This has created a world full of echo chambers within Facebook, in which each user's confirmation bias is reinforced. By definition, confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs. In this case, it means liking, commenting on, and most importantly sharing content that confirms your own preexisting beliefs, content that then circulates through a social media circle made up of people whose beliefs resemble your own. All of this widens the distance between the different echo chambers on social media, as every group shares information supporting its own beliefs without being exposed to different points of view.
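To make the filtering mechanism concrete, here is a deliberately simplified sketch in Python. It is an illustration only, not Facebook's actual ranking system: candidate posts are scored purely by how well their source and topic match what a user already follows, so already-aligned content crowds out everything else.

```python
# Toy illustration of a personalized feed: posts from sources and topics
# the user already likes always outrank unfamiliar points of view.
from dataclasses import dataclass

@dataclass
class Post:
    source: str   # page or friend that shared the post
    topic: str    # rough topic label, e.g. "politics-left"

def build_feed(posts, liked_sources, preferred_topics, size=3):
    """Return the top `size` posts, scored only by prior affinity."""
    def score(post):
        s = 0
        if post.source in liked_sources:
            s += 2   # content from pages you already follow wins
        if post.topic in preferred_topics:
            s += 1   # topics you already engage with win
        return s
    return sorted(posts, key=score, reverse=True)[:size]

if __name__ == "__main__":
    posts = [
        Post("PageA", "politics-left"),
        Post("PageB", "politics-right"),
        Post("FriendC", "politics-left"),
        Post("PageD", "science"),
    ]
    feed = build_feed(posts,
                      liked_sources={"PageA", "FriendC"},
                      preferred_topics={"politics-left"})
    for p in feed:
        print(p.source, p.topic)   # the dissenting "politics-right" post never surfaces
```

Even in this toy version, the opposing viewpoint is pushed to the bottom of the ranking every time, which is the filtering effect the rest of this piece is concerned with.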

In 2014, a survey conducted in the United States by the Pew Research Center showed that "'consistent conservatives' were twice as likely as the average Facebook user to say that posts about politics on Facebook were 'mostly or always' in line with their own views," and that four in ten "consistent liberals" say they have blocked or unfriended someone over political disagreements. The real issue with confirmation bias, and how it is reinforced in echo chambers, was clearly described by Alessandro Bessi, a postdoctoral researcher at the Information Sciences Institute at the University of Southern California: "Users show a tendency to search for, interpret, and recall information that confirm their pre-existing beliefs." Bessi says this confirmation bias is actually one of the main motivations for sharing content. The result is a surge of misinformation across the different echo chambers, or communities, within Facebook, even when the users live in the same geographical area. Another issue raised by the 2014 survey is that the ease of following and unfollowing people on social media lets users block or unfriend anyone who holds a different point of view, which in turn reinforces their own beliefs because they end up interacting only with people who share them.

President Bill Clinton once said: "We are more connected than ever before, more able to spread our ideas and beliefs, our anger and fears. As we exercise the right to advocate our views, and as we animate our supporters, we must all assume responsibility for our words and actions before they enter a vast echo chamber and reach those both serious and delirious, connected and unhinged." While social media platforms keep tackling the spread of misinformation by refining their algorithms, the issue will not be resolved until users start to fact-check the news they are about to share and keep an open mind toward different points of view.

References

Media, DNN. "Social Media Echo Chambers and Our Own Confirmation Bias." Medium, DNN Media, 21 Sept. 2017, https://medium.com/dnnmedia/social-media-echo-chambers-and-our-own-confirmation-bias-fcd89d7fa11c

Bixby, Scott. "'The end of Trump': how Facebook deepens millennials' confirmation bias." The Guardian, 1 Oct. 2016, www.theguardian.com/us-news/2016/oct/01/millennials-facebook-politics-bias-social-media

Willingham, AJ. “How Facebook actually isolates us.” CNN, Cable News Network, 22 Jan. 2017, www.edition.cnn.com/2017/01/22/health/facebook-study-narrow-minded-trnd/index.html

Emba, Christine. “Opinion | Confirmed: Echo chambers exist on social media. So what do we do about them?” The Washington Post, WP Company, 14 July 2016, www.washingtonpost.com/news/in-theory/wp/2016/07/14/confirmed-echo-chambers-exist-on-social-media-but-what-can-we-do-about-them/?utm_term=.cc07f96f27ea
