FirePhotography via Creative Commons

The Social Web: when will we get it right?

Ah yes… the social web. The epitome of our current world; society’s latest revolution, as many call it. A revolution with various pros and cons. An innovation with many praises and, as of late, twice as many complaints. If you have been like me, you have noticed the recent increase in flak about the state of the social web. I am not just talking about Twitter’s struggle to make shareholders happy or Reddit’s recent increase in ads. There is something fundamentally overwhelming about our social media platforms. After reading an excellent observational piece by umair haque on Why Twitter’s Dying, I was inspired to add my two cents to the mix. Umair points out that “… abuse is the greatest challenge the web faces today.” For those who didn’t read Umair’s essay, let me summarize what he means by “abuse”: it is not abuse in the sense of cyber-bullying or even trolling (although trolling is an issue of its own), but “endless bickering, the predictable snark, the general atmosphere of little violences that permeate the social web.” We have all experienced this; our social networks are filled with back-and-forth arguments over half-truths, and this hefty amount of bickering is overwhelming. Why? I think there are two primary underlying system-level issues that make this abuse/bickering so overwhelming:

  1. The increased polarization of users in our networks
  2. The increased duplication of information in our networks

Polarization: The issue of polarization has popped up quite a bit since Facebook researchers released the paper “Exposure to ideologically diverse news and opinion on Facebook.” Zeynep Tufekci wrote a great commentary on the paper that is worth the read. In summary, Facebook’s newsfeed algorithm hurts users’ access to ideologically diverse (or “cross-cutting”) content. Users will mostly see what other like-minded users share. Conservatives will get mostly conservative posts (Fox News, etc.) and liberals will get mostly liberal posts (Huffington Post, etc.). Of course, if you are a well-balanced person with a good mix of liberal and conservative friends, and you click posts from each viewpoint with equal probability, you may see a more balanced set of posts. There are several other debates about the paper, but I will not get into those now. The point is that it has been shown that we see ideologically like-minded posts/news on Facebook because of the algorithm.

So, what’s the big deal? We don’t want to see those gosh darn liberal (or conservative) posts anyway… Well, these curating algorithms are not necessarily doing us a favor. It has been shown that limited exposure to “attitude-challenging” information causes people to become more split in their viewpoints over time and causes users to “misperceive facts” about current events. (A 2007 paper by Stroud called “Media Use and Political Predispositions: Revisiting the Concept of Selective Exposure” hits on this, as does this paper.)
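To give some intuition for how filtering alone can split a population, here is a toy sketch (not from any of the papers above; every number here is a made-up assumption) of a bounded-confidence opinion model in Python. Each user only “hears” opinions within a fixed distance of their own, mimicking a feed that screens out attitude-challenging content, and the population settles into separated clusters instead of consensus:

```python
import random

random.seed(0)

# Hypothetical setup: 50 users with opinions spread across [-1, 1]
opinions = [random.uniform(-1, 1) for _ in range(50)]
THRESHOLD = 0.3  # users only "hear" opinions this close to their own

for _ in range(50):
    opinions = [
        # each user averages only the like-minded opinions they can see
        sum(o for o in opinions if abs(o - mine) <= THRESHOLD)
        / sum(1 for o in opinions if abs(o - mine) <= THRESHOLD)
        for mine in opinions
    ]

clusters = sorted({round(o, 2) for o in opinions})
print(f"{len(clusters)} opinion cluster(s): {clusters}")
```

With a small threshold the population freezes into a handful of mutually unreachable clusters; widen the threshold (i.e., more cross-cutting exposure) and the same model tends toward consensus.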

Although not all popular social networks use algorithmic curation, curation can still cause divide across the social web as a whole, because external information flowing into one network likely comes from a curated source. What I mean is that the social web is highly coupled: users will likely get at least a portion of the information they share in one network from another network (e.g., sharing something on Twitter from Facebook). Twitter is currently just a LIFO stack of information, not curated (although there are rumors it will be algorithmically manipulated soon), but increasingly extreme views in one network may be carried over to a different network. This idea of one network’s information affecting another’s has not been empirically tested (at least to my knowledge), but according to some 2014 Pew Research data, “a notable number of internet users exploit both Facebook and Twitter platforms.” Thus, I would think information is bound to leak from one platform to another, polarizing much of the social web. Interestingly, independent of the social web, another Pew Research study has hinted that we could be becoming more polarized as a society. Hence, maybe all our networks are becoming more divided… Overall, this growing division in social networks can be thought of as fuel for the abuse umair haque talks about in his essay, making online abuse more overwhelming and prevalent.
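As a concrete (and purely illustrative) sketch of that LIFO-versus-curated distinction, here is what the two feed orderings look like in Python; the `Post` fields and engagement scores are hypothetical stand-ins, not anyone’s real ranking function:

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int     # arbitrary units; higher = newer
    engagement: float  # hypothetical likes/retweets proxy

posts = [
    Post("a", 1, 0.9),
    Post("b", 2, 0.1),
    Post("c", 3, 0.5),
]

# Twitter-style LIFO: newest first, no curation
lifo_feed = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# Curated feed: ranked by predicted engagement, which can
# systematically surface like-minded, high-engagement content
curated_feed = sorted(posts, key=lambda p: p.engagement, reverse=True)

print([p.author for p in lifo_feed])     # ['c', 'b', 'a']
print([p.author for p in curated_feed])  # ['a', 'c', 'b']
```

The same three posts, two very different front pages; which ordering a platform picks is exactly the system-level choice this section is about.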

So, let’s think of it like this: the social web is a house, and our individual social networks are rooms in that house filled with great stuff (quality information), but a fire (of bickering users) blocks the door into each room. We have to wait until the fire dies down to go inside and get our cool stuff. Sadly, users keep throwing gasoline on the fire (because they have to defend their extreme views). So, what are our options? We can work really hard to take an alternate route into the room, like breaking a window (manually filtering through the garbage in our news feeds), or we can just leave the house to burn (leave the social network and stop contributing quality information).

OR the social networks could be built more fireproof…

Duplication: Another reason this online bickering seems so overwhelming is information duplication. The effects of information duplication on networks are something my research group is currently studying, so I will briefly mention why I think this is important without spoiling the details of what we have found just yet… Information is becoming available from an increasing number of networked sources, instead of the few news sources commonly used in the past (the good ol’ days of print, etc.). This increased competition between news/information sources leads to increased duplication of information within the network. We have all seen this, whether it is Conservative Daily taking something from Fox News or BuzzFeed taking something from Reddit (Reddit users, you know what I am talking about!). Because of this, users are getting overloaded, and not necessarily with quality information. If this information is becoming increasingly ideologically extreme (maybe even… misperceived/half-truths?), we are going to see this online bickering and abuse at an overwhelming rate. And if users catch on, they will leave (leave the house to burn), or at least the users who want/produce quality information will… which will only continue to decrease quality… TLDR: duplication of information, specifically polarized information, will cause a downward spiral of doom for the social web… maybe I am exaggerating a little, maybe I am not. It could end pretty badly…
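To give a sense of how one might even measure this duplication, here is a toy sketch (the sources and headlines are made up, and real systems would use fuzzier similarity measures like shingling rather than exact hashes) that fingerprints normalized headlines to flag near-verbatim copies:

```python
import hashlib

def normalize(headline: str) -> str:
    # crude normalization: lowercase, keep only letters and digits
    return "".join(ch for ch in headline.lower() if ch.isalnum())

def fingerprint(headline: str) -> str:
    # hash the normalized text so reworded punctuation/casing still matches
    return hashlib.sha1(normalize(headline).encode()).hexdigest()

# hypothetical stream of (source, headline) pairs
stream = [
    ("FoxNews", "Senate Passes Budget Bill"),
    ("ConservativeDaily", "Senate passes budget bill!"),
    ("BuzzFeed", "10 Cats Who Can't Even"),
]

seen = {}
for source, headline in stream:
    fp = fingerprint(headline)
    if fp in seen:
        print(f"{source} duplicates {seen[fp]}")  # ConservativeDaily duplicates FoxNews
    else:
        seen[fp] = source
```

Even this crude exact-match fingerprint catches the copy; the interesting research questions start when the duplicate is paraphrased or ideologically reframed on its way between networks.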

In conclusion, both the polarization and the duplication of information are causing this perversion of quality social interactions and quality information. Notice that these two problems are not just social issues, but system issues. Our social network platforms are not properly facilitating interactions and information flow… and users will start leaving if it continues. Companies are focused on monetizing our social networks rather than looking deeply at what their platforms are doing to society. We are extremists. We fight for our beliefs without considering others. And we love to do it behind a computer screen… I do not necessarily have the answers to these problems (at least right now), but I believe we can fix them. Or heck… maybe this is just something about society that our tools have pointed out to us. In any case, we need to be thinking about our systems’ effects on society as a whole, and if businesses in the social web game want to survive long term, they need to be thinking too.

Benjamin D. Horne

October 26, 2015

Troy, NY