What’s the Problem with Facebook?

Silas Rickshaw
5 min read · Dec 7, 2018


The Facebook Dilemma is a two-part series produced by Frontline and broadcast on PBS in October of 2018. The first part explains the history of Facebook from its beginning up to the 2016 United States presidential election. The second part dives further into the election and then discusses what happened in the years leading up to the series airing in 2018.

Facebook was founded at Harvard University by Mark Zuckerberg. It was initially meant to connect college students, but it quickly expanded far beyond them. The mission statement Facebook has had from the beginning is to “connect the world”. In Zuckerberg’s own words, the mission is to “give people the power to share in order to make the world more open and connected”, while also making Mark and the investors filthy rich, since everyone everywhere would be able to communicate with just about anyone. Facebook has held to this mission throughout its growth, and grow it did. Facebook went from colleges, to high schools, to junior high schools, to every person over thirteen. Soon it expanded outside of the United States, going from country to country, gaining users and, as the company hoped, connecting more people on a single platform than ever before while bringing in revenue at an insane rate. Unfortunately, Facebook’s slogan, “Move Fast and Break Things,” would also come to pass. Facebook has been focused on expansion for much of its existence. The company built features that would draw in new users and keep old users on the site. Facebook was translated into hundreds of languages. Facebook grew from Harvard to the world in less than two decades. Facebook did move fast.

Facebook, like most tech companies, kept adding features to its platform. The Like button allowed Facebook to gather information on what its users want to see quickly and efficiently. The button meant that ambiguous comments were no longer the only things Facebook could keep track of. This data was used to figure out what you are interested in. The Likes and comment content are run through an algorithm that works on a simple principle: these people liked this post, so similar people will probably like it too. It then projects topics and posts it thinks you will like into your News Feed. It may have started with just that, but it has since gone much further. Facebook has fingers that collect data from everywhere. Anything on your phone or browser may be tracked and collected by Facebook. All of this information is then used to put more content that you like in front of your face. Over the years, several people have confronted Facebook with the complaint that it works too well. Say someone posts something awful and untrue. By this time the algorithm already knows there are people who will like it. The result is a group of extremists snowballing downhill, gaining numbers and awfulness. This process has been connected with protests, revolutions, mobs, murders, and genocide. For example, in Myanmar an angry mob formed over an untrue post about an innocent man. The man in question was a Muslim, a minority in the predominantly Buddhist country, who had allegedly raped a Buddhist woman. Two people died in that mob. Thousands were displaced by the mobs that followed. All that was needed was one excuse for the Buddhists, a little push, and the social stability of Myanmar collapsed. Facebook also allows people to target others very easily, in two ways. First: if someone does not like what someone else is saying, they can simply get on Facebook and proceed to humiliate, discredit, and dehumanize that person publicly.
Second: Facebook does not just use the collected information to decide what goes in your News Feed; it also “sells” that information to other companies via targeted ads. All a company has to do is fill out a quick questionnaire, and Facebook will show its ads to the people most likely to buy the product. Facebook did break things.

Another problem arose later, when the news industry decided it wanted in on Facebook’s growth. There really was not much you had to do to get onto Facebook as a news publisher. This opened a window for illegitimate news websites to build a large following while posing as news agencies. People used this opening to put fake news online, and readers were drawn in. One of the more widely circulated pieces of fake news claimed that the Pope had endorsed Donald Trump. This is completely untrue, but it got a bigger following than The New York Times story on Trump’s tax returns. These fake news sites crafted their stories to pull readers onto their pages. All that was needed was a phony story, a website, and a few ads on that website, and they were making money: thousands, even hundreds of thousands, of dollars.

The 2016 presidential election is what really brought broad public attention to the idea that something might be wrong with Facebook. The U.S. investigations dug up some interesting finds. It appeared that Russia wanted to manipulate the outcome of the U.S. election. Investigators traced several fairly toxic accounts back to Russia. These fake accounts were made to inflame one side of a controversy; another fake account would then agitate the other side, escalating tensions between the factions. This discontent was seeded by Russians who did not care about the arguments and were never really involved in the issues in any way.

Section 230 of the Communications Decency Act says that tech companies like Facebook are not responsible for what their users post. This meant Facebook had to decide for itself what it would and would not allow on its site. There were no set bounds on what Facebook can and cannot take down. There still are not. Anything Facebook doesn’t want up, it can take down.

Now, whether this is a problem with the people who use Facebook with malicious intent, the gullibility of the users, Facebook’s lack of policing, or the algorithm that polarizes people is up to each of us to decide.

Now for the one question no one knows the answer to: “What do we do?” Banning people from Facebook may work, but they could make fake accounts faster than Facebook could take them down. Preemptively removing those who may be inclined to join radical pages would be unfair. Another option is to shut down Facebook entirely. This would have major ramifications: basically every country with internet access has Facebook users, and this would punish the whole for the actions of a few. If Facebook restricts what appears on Facebook, we still have a problem. Where is the line drawn for free speech? Arguably this would create more problems than already exist. Another option is taking down illegitimate users, which runs into the same problems as taking down specific posts. The honest truth is that we have never had to deal with this sort of thing before. It is a new platform, with connection on an incredible scale. We just do not know what to do.

This is a major, global problem. People’s lives are at stake. Currently this chain of events relies on already existing contention; in time, these conflicts could be manufactured artificially. Countries have already been torn apart.

In the end, we simply do not know how to solve the problem. There may be no single solution to it.
