Truth

Dev Chakraborty
Published in Ideas and Words · 2 min read · Nov 17, 2016

Since the election, Facebook (along with Google and President Obama) has declared a war of sorts on “fake news”. Here’s Zuck:

Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. … That said, we don’t want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here.

— Mark Zuckerberg, CEO, Facebook

I don’t doubt the veracity of what Mark said here (and in the rest of his post on his timeline), but I can’t help but feel like this raises more questions than it answers.

The First Amendment asserts that everyone is entitled to free speech within reasonable limits. The common example of a reasonable limit: you can’t yell “fire!” in a crowded theatre (that isn’t actually on fire). Basically, you can say whatever you want so long as it doesn’t put anyone else in danger.

What constitutes danger, though? The example where physical harm might occur is pretty clear-cut, but what about other kinds of damage? Should we expand the scope of danger to include blatant misinformation? If so, what else should it include?

There’s a grey area here, as Mark himself acknowledges. The truthfulness of an article isn’t always an objectively measurable thing. It’s possible to make bad arguments from great data and assumptions, and the reverse is possible as well. That said, it’s not hard to imagine a solution that establishes some sort of benchmark for what makes a news story “fake.” I’m optimistic that what Facebook is trying to do here is doable.

If we could all agree that fake news is unacceptable on social media, that would be a great step forward, and it would eliminate a lot of noise from the discourse on important topics like politics and science. At the same time, such an agreement would be wildly inconsistent with other things we continue to accept every day: inaccurate or sensationalist reporting by cable news outlets, misleading advertisements for shoddy products, and outright lying by corrupt politicians.

At the end of the day, I think the truth is that humans, at their core, don’t care about truth or logic. The poignancy of a news article holds far more weight for us than its factuality. I don’t have numbers to back this up (ironically), but I would bet that the <1% of content that Mark says is fake draws more engagement from Facebook users than the factual content does.

That’s a depressing thought. But maybe Facebook’s efforts will help ground us in reality, something that feels sorely missed in today’s media climate.

Follow Ideas and Words for more like this!

