Fiyin A.
Published in Musings
Nov 23, 2018


Echo chambers, filter bubbles, opinion mining, and opinion manipulation (Eric Gilbert, Tony Bergstrom and Karrie Karahalios)

How were agreement and disagreement measured in this paper?

1.) Two researchers sampled 1094 comments from Technorati’s curated list, consisting of the top 100 most important and influential blogs in the world. These human raters independently assessed the comments and classified each one as agree, disagree, or neither.

Beginning from an intuitive understanding of agreement, the raters coded 10% of the data and then stopped to check inter-rater reliability via Cohen’s κ, a standard measure (a small illustration of this check appears at the end of this answer).

2.) A completely automated approach: they built a predictive model and, with the help of algorithms, came up with varying answers.

There was also an overlap set, in which most of the discrepancies resulted from the first rater consistently choosing neither while the second chose agree or disagree; they agreed, however, that a more finely grained scheme might have helped reduce those instances. Only very rarely, in 2 of 284 instances, did the raters assign a conflicting agree-disagree pair, which according to them signalled a very high degree of consistency in the data.
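As a quick illustration of that reliability check, here is a minimal Python sketch using scikit-learn’s cohen_kappa_score. The labels below are made up purely for demonstration; they are not the paper’s actual data.

```python
# Minimal sketch of an inter-rater reliability check with Cohen's kappa.
# The two label lists are invented examples, not data from the paper.
from sklearn.metrics import cohen_kappa_score

rater_1 = ["agree", "neither", "disagree", "agree", "neither", "agree"]
rater_2 = ["agree", "agree",   "disagree", "agree", "neither", "neither"]

# Cohen's kappa corrects raw percent agreement for the agreement expected
# by chance; values near 1 indicate strong consistency between raters.
kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa: {kappa:.2f}")
```

The point of using κ rather than plain percent agreement is that two raters choosing from only three categories will agree fairly often by chance alone, and κ discounts that.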

From your personal experience, give some examples of online communities where there is very little disagreement between opinions (or where disagreement is not tolerated).

1.) E-commerce website reviews: This is a recognized space where opinions are understood to be personal, so anyone writing or reading usually just agrees or states their own experience. (e.g., Shopify, Amazon, Alibaba)

2.) Q&A platforms: For the same reason that everyone is entitled to their own opinions and experiences, very little disagreement usually arises. (e.g., Ask.com, Stack Overflow, Quora)

3.) Advocacy and crowdfunding communities: Examples are GoFundMe, Change.org, etc.

What features of the Facebook wall (or Facebook in general) promote the development of echo chambers?

1.) The Like (now Reaction) button: For instance, if a post expressing an ideology appears on my news feed and I click the Like button, that action notifies Facebook that I agree with that ideology. Facebook would, in turn, continue to show on my wall sources and posts that reinforce that ideology, until my entire news feed becomes a closed system filled only with what I already agree with (a toy sketch of this feedback loop appears after this list).

2.) Displaying the number of likes/reactions: The higher the number of reactions or likes on a post, the more credible it is perceived to be, and this too can promote echo chambers. The same thing can happen when taking a poll: if an ordinary user sees that a voter they consider more knowledgeable on the topic has chosen an option, he/she is more likely to vote the same way.

3.) Facebook verification (the blue badge): This might not be as direct, but it has its own subtle ways of promoting echo chambers, since posts from verified accounts carry an added air of authority.
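To make point 1 concrete, here is a toy Python simulation of that like-driven feedback loop. This is not Facebook’s actual ranking algorithm; the topics, starting weights, and boost value are all made-up assumptions used only to show how repeated likes can narrow a feed over time.

```python
import random

# Toy simulation of a like-driven feed: every "Like" on a topic increases
# that topic's weight, so the feed keeps serving more of the same.
# This is an illustration of the feedback loop, not Facebook's algorithm.

topics = ["ideology_a", "ideology_b", "sports"]
weights = {t: 1.0 for t in topics}     # the feed starts out balanced
liked_topics = {"ideology_a"}          # the topics this user agrees with

def next_post():
    # Pick the next post in proportion to each topic's current weight.
    return random.choices(topics, weights=[weights[t] for t in topics])[0]

random.seed(0)
for _ in range(1000):
    post = next_post()
    if post in liked_topics:           # the user clicks Like
        weights[post] += 0.5           # the feed boosts similar content

share = weights["ideology_a"] / sum(weights.values())
print(f"Share of feed weight on the liked ideology after 1000 posts: {share:.0%}")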
