Do You Trust Facebook To Judge Your Integrity?

The monolith of deception is asking you to trust them, again

Ryan Ozonian
Private Parts - by Ryan Ozonian
4 min read · Aug 29, 2018


Are you trustworthy? That’s the question at the center of Facebook’s new rating system, which assigns each user a trustworthiness score on a scale of 0 to 1.

“A user’s trustworthiness score isn’t meant to be an absolute indicator of a person’s credibility,” Tessa Lyons, the product manager in charge of fighting misinformation at Facebook, told The Washington Post. “Rather, the score is one measurement among thousands of new behavioral clues that Facebook now takes into account as it seeks to understand risk.”

If you’re thinking…

Facebook — the monolith of deception — judging my honesty is a bit like tasking a two-year-old with deciding how much soda is too much soda. (Chances are their answer will be that there’s no such thing as too much soda.) Well, somehow, Facebook sees it differently. But let’s give Facebook the benefit of the doubt for a moment and assume that its new rating system is actually meant to fight fake news and false information. Does that make it any more acceptable?

Not really…

See, the larger problem with assigning a trustworthiness score — or any score, for that matter — to user profiles is that it greatly underestimates the extent to which a person’s profile spills over into their actual life. The rating system’s stated purpose is narrow, but what’s stopping the same technology or algorithm from extending into other parts of your online life, and from there into your actual life? What if a bank or a car dealer wanted to translate your trustworthiness score into a creditworthiness score?

If you think that’s absurd, you may want to reconsider. Because in China, this ostensibly fictional “Black Mirror” episode is anything but fictional. Since 2014, the Chinese government has been developing a much broader “social credit” system partly based on people’s routine behaviours with the ultimate goal of determining the “trustworthiness” of its 1.4 billion citizens. If that sounds like the dystopian nightmare I’ve already written about, that’s because it is. According to Wired, social credit is already preventing people from buying airline and train tickets, stopping social gatherings from happening, and blocking people from going on certain dating websites.

Yet another issue with Facebook’s trustworthiness…

Yet another issue with Facebook’s trustworthiness score is that it’s been developed in a black box. In other words, Facebook executives won’t explain what’s behind their trustworthiness algorithm because they’re afraid that if people knew, they would try to game the system. And while that may seem like a perfectly reasonable excuse for keeping information from the public, it’s unacceptable when that information is being used to tell us what sort of people we are. We took a different approach when we launched our Mercury Protocol project, and we think that leveraging the crowd is a much better model than a black box. You can’t fake being a “nice guy” when everyone knows you’re actually an asshole.

A good rating system shouldn’t be gameable.

Everyone should be able to judge your character based on key factors — like not lying and being generous — that you can’t fake. If Facebook can’t figure out a way to apply these principles to its own digital space, it shouldn’t be trying to classify our trustworthiness.

In fact, it’s unequivocally dystopian for our technological overlords to develop an algorithm that determines whether we’re honest, decent, or lovable. And that’s not to mention that Facebook already uses our data to categorize what sort of people we are so it can decide precisely which ads to serve us.

My point is…

To say that I’m apprehensive about believing Facebook has good intentions in this next step on its road to regaining the public’s trust is an understatement. I do believe that fake news and false information is a serious issue that requires a serious solution, and for that reason I applaud Facebook’s proactive approach. I’m just not sure I trust the vehicle for proliferating false information to decide whether or not I’m the one who’s full of shit.

Follow Me

Twitter
LinkedIn

Practice Safe Texting — Use Dust

Dust Website



CEO & Co-Founder of Dust Messenger — passionate entrepreneur building a new digital world based on trust