The Need For Social Media Regulation

Tom Foale
Published in #NoDust on Brexit
4 min read · Sep 10, 2018

There is something incredibly dangerous about social media, much more dangerous than previously understood: self-reinforcing feedback. It means that social media bears a far greater responsibility for Brexit, Trump, increasing racism and the rise of both the far right and the far left than has generally been acknowledged.

Guillaume Chaslot worked on a YouTube algorithm that used the time a user spent watching a particular video to recommend further videos, increasing the total time that user spent watching and therefore maximising YouTube’s advertising revenue. Guillaume tweeted:

This algorithm drives a positive feedback loop, and unfortunately mere transparency does not help much. If you like puppies and kittens then YouTube will show you more adorable pets. But if you are a closet extremist then YouTube will show you videos that support and amplify your extremism, normalising that behaviour, and the more extreme the videos you watch, the more extreme the content you will be shown next.
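To make the loop concrete, here is a minimal, purely illustrative sketch of a watch-time-driven recommender. It is not YouTube’s actual algorithm; the scoring rule, the numbers and the idea that engagement peaks slightly beyond a user’s current taste are all assumptions made for the example.

```python
# Toy model of a recommender optimised only for watch time (illustrative,
# not YouTube's real system). The assumed twist: predicted watch time peaks
# for videos slightly more extreme than what the user already likes, so
# optimising watch time alone ratchets the user's taste upward.

def expected_watch_time(video_extremity: float, user_taste: float) -> float:
    """Assumed engagement model: peaks just beyond the user's current taste."""
    return 1.0 / (1.0 + (video_extremity - (user_taste + 0.1)) ** 2)

def recommend(candidates: list[float], user_taste: float) -> float:
    """Pick the candidate video that maximises predicted watch time."""
    return max(candidates, key=lambda v: expected_watch_time(v, user_taste))

candidates = [i / 20 for i in range(21)]  # videos from mild (0.0) to extreme (1.0)
user_taste = 0.2                          # the user's starting preference

for step in range(10):
    chosen = recommend(candidates, user_taste)
    user_taste += 0.5 * (chosen - user_taste)  # watching shifts taste toward the video
    print(f"step {step}: recommended {chosen:.2f}, taste now {user_taste:.2f}")
```

Nothing in this loop has to “want” extremism; maximising a single engagement metric is enough to drive the drift.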

Feedback is everywhere in our lives. It controls our bodies, our utilities, our money supply, our weather, our governments and, through social interactions, our behaviours. Yet its impact is poorly understood: feedback plays a big part in economics and leads to booms and crashes, but is barely studied.

There are two types of feedback loop. Negative feedback ‘feeds back’ a portion of the output signal to the input, acting in the opposite direction to the input signal. Negative feedback is responsible for stability and control. It’s what keeps our body temperature stable, our speed constant when driving and our money stable relative to the goods and services we buy. Here’s a simple negative feedback loop for driving a car at constant speed.

[Figure: Negative feedback loop]
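As a rough sketch of that loop, here is a simple proportional controller with made-up numbers (a minimal illustration, not a real vehicle model): the speed error is fed back with the opposite sign, so disturbances are damped rather than amplified.

```python
# Negative feedback: a toy cruise controller (illustrative numbers only).
target_speed = 70.0   # the speed we want to hold
speed = 50.0          # current speed
gain = 0.5            # how strongly we correct per unit of error
drag = 2.0            # a constant disturbance slowing the car each step

for step in range(10):
    error = target_speed - speed   # measure the output against the target...
    speed += gain * error - drag   # ...and feed it back against the deviation
    print(f"step {step}: speed {speed:.1f}")

# The speed settles close to the target (offset slightly by the constant
# drag) instead of running away: the hallmark of negative feedback.
```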

Positive feedback amplifies small input signals. It is responsible for tornadoes and hurricanes (“the butterfly effect”), for mechanical devices shaking themselves to bits, and for the spread of panic behind cattle stampedes and runs on banks like Northern Rock. Positive feedback can be useful: it drives contractions in childbirth, the exponential growth of companies and the oscillators in our electronics. But it can also be extremely destructive, because it tends to drive things to extremes.
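Flipping the sign of the feedback in the same toy notation as the cruise-control sketch above (again a made-up illustration, not drawn from the article’s diagrams) shows why positive feedback runs away:

```python
# Positive feedback: the fed-back signal reinforces the deviation
# instead of opposing it, so a tiny disturbance grows exponentially.
value = 1.0   # a small initial disturbance
gain = 0.3    # fraction of the output fed back in the *same* direction

for step in range(10):
    value += gain * value   # the output reinforces itself
    print(f"step {step}: {value:.2f}")

# value multiplies by (1 + gain) each step: unchecked exponential growth,
# which in the real world stops only when something saturates or breaks.
```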

In many areas negative and positive feedback loops interact, leading to extremely complex behaviours. A good example is the set of self-reinforcing, interacting feedback loops that drove the exponential growth of Apple’s smartphone business.

Feedback is used to groom extremists. Take someone’s natural indignation at an injustice, then deliver more and more information that feeds that anger and blocks rational thought. It creates believers who will not easily accept counter-arguments based on logic and evidence, particularly when they are told that those with other views are trying to hurt them in some way.

The same mechanism drives Brexit. How many of those who still think leaving the EU is the right thing to do use the words ‘I believe’ when talking about their views? All of them. They can’t support their case with evidence or point to real benefits of leaving, yet they don’t accept any expert’s evidence either. Their minds can’t be changed. It’s a form of extremism.

As the diagram below shows, every feedback loop in this system is positive: it is in YouTube’s (and every other ad-funded social media platform’s) interest to maximise the divisive content it hosts, because that content drives additional contributions and advertising income. There are few controls on what social media can host, because of laws on freedom of speech. The only control on the content we absorb is our own moral character.

[Figure: Social media feedback loops]

What YouTube is doing is amplifying social division and, almost certainly, contributing to the rise in mental illness and violence. We need to break this positive reinforcement loop. Like pollution, social networks impose a high cost on society while the network operator benefits. This is a type of market failure called a negative externality. Just as we use regulation to ensure that the polluter cleans up their own mess, we need regulation of social media to eliminate this externality.

As a society we already control content of a sexual nature on social media platforms because people find it offensive. There’s no reason why we shouldn’t similarly control violent, extremist or racist content. However, we don’t have to ban it. We can use nudge theory to reduce positive feedback.

YouTube and Facebook should put contributors of extremist material into ‘pay to view’ categories. Users would have to register with the site using their personal details to view this content, and payment by anonymous means, such as Bitcoin, would be banned. This would help keep youngsters from radicalisation and would deter users from viewing the more extreme content, knowing that their viewing was no longer anonymous. It would also signal to content providers that their content was too extreme: to increase their income, it would benefit them to keep it out of the ‘pay to view’ area.

In the same way, however, social media could be used positively, to improve society’s general mental health. By detecting people with potentially harmful thoughts and promoting content to them that widens their viewpoint (negative rather than positive feedback), we could have a much safer, more tolerant and happier society. This is a positive side to social media that could, with the right regulatory environment, be of great benefit.
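One way to picture that idea, as a hypothetical sketch rather than any existing platform feature, is a recommendation score with a corrective term that penalises similarity to what the user has recently watched, so the feedback pushes against narrowing instead of reinforcing it:

```python
# Hypothetical "negative feedback" recommendation score (illustration only):
# predicted engagement minus a penalty for being too close to what the
# user has just been watching, nudging them toward a wider range of content.

def diversified_score(video_topic: float, recent_topics: list[float],
                      engagement: float, diversity_weight: float = 0.5) -> float:
    closeness = sum(1.0 / (1.0 + abs(video_topic - t)) for t in recent_topics)
    return engagement - diversity_weight * closeness  # the corrective term

# Two candidates with equal predicted engagement: the one further from the
# user's recent viewing scores higher, widening the viewpoint.
recent = [0.80, 0.85, 0.90]
print(diversified_score(0.88, recent, engagement=1.0))  # close to recent habits
print(diversified_score(0.30, recent, engagement=1.0))  # further away; wins
```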

Tom Foale
#NoDust on Brexit

Troubleshooter, problem solver, innovator and out-of-the-box thinker. Owner/CTO at Klaatu IT Security.