Facebook Decides What Comments Are “Most Relevant” for You. Censorship, or Just a Loving Social Media Partner?

Harry Cardillo
6 min read · Aug 13, 2021


When someone posts something publicly on Facebook, all of the comments posted under that post are, by default, set to “most relevant.” This is true for each and every post on Facebook, no matter who has posted it. Of course, Facebook decides what is “most relevant.”

In the past, from what I can gather from an internet search, a user could change this setting to reflect that they wanted to see “all comments” by default, not just those Facebook deems “most relevant.” Under the current dynamic, if you want to see “all comments” on a post, you have to manually change the setting on each and every post you view, which, I would argue, certainly deters people from actually doing so. Why “all comments” is not the default setting, and why a user can’t change the default to their own preference, seems nefarious, especially given the climate of Big Tech censorship that many of us are seeing and experiencing. Here, I will examine what Facebook claims are its motives in setting “most relevant” comments as the default, in the context of the method it uses to make that determination.

Two years ago, a Facebook user by the name of Mark asked about this very issue. Here’s Mark’s question:

https://www.facebook.com/help/community/question/?id=10157158719799621

And here is the answer from someone named Emma, on the Facebook help team:

https://www.facebook.com/help/community/question/?id=10157158719799621

As you can see from the answer, Facebook claims to use feedback from its members to enact these types of policies. In addition, on its “Company Info — About Facebook” page, under an article entitled “Making Public Comments More Meaningful,” Facebook claims to be using “surveys” to find out what kind of comments people want to see:

https://about.fb.com/news/2019/06/making-public-comments-more-meaningful/

This alleged use of a feedback loop and surveys seems to be entirely contradicted by the 216 answers (actually comments) posted under Mark’s question. I scanned through all of them, and not a single one said anything like, “I prefer Facebook to determine which comments are most relevant for me.” In fact, just the opposite is true.

I don’t know what “survey” Facebook took, but just going off the other (currently) 216 answers under Mark’s question, one could surmise that most Facebook users don’t want Facebook to help them determine which comments they see. They want to be able to see all comments and decide for themselves what they believe is “most relevant.”

Of course, maybe Facebook is just a loving social media partner, one who is really just trying to “promote meaningful conversations” for its users. Or is something else going on here, as many of the answers under Mark’s question seem to suggest?

If you describe something as “meaningful,” you mean that it is important or useful in some way. So, this default setting would be fine if the comments were actually ranked by the number of likes or interactions, which would push to the forefront those comments that are most “meaningful” to the users interacting with them. However, Facebook added a subjective element into its comment “ranking”: something it calls “integrity signals.” What this entails exactly is unknown, because what Facebook says about it is vague at best:

https://about.fb.com/news/2019/06/making-public-comments-more-meaningful/

The bottom line is, Facebook claims that it wants “people to see safe and authentic comments.” Clearly what Facebook means by this is that it wants users to see comments it believes are “safe and authentic.” Safe and authentic, in the context of a comment or of speech itself, can be objective. Certainly, untruthful words can adversely affect a person, especially someone who hears or reads those words and relies upon them to their detriment. You can’t yell “fire” in a crowded movie theater that’s not on fire, because it could needlessly injure people fleeing for the exits. Free speech is not without restriction, even in the United States. And so, weeding out false information, and limiting people’s exposure to it, is a valid purpose, one that a company like Facebook could certainly pursue.

However, in the context of an opinion, weeding out what someone deems to be “authentic,” or more succinctly what they deem to be undisputed, serves no valid purpose other than to censor that opinion. Opinions can never be undisputed; they are, by their very nature, inconclusive. Notice how Facebook did not say, “we want people to see truthful comments,” which would serve that valid purpose while also limiting its analysis to comments that actually make a factual claim. Unfortunately, under its current system, Facebook is not simply fact-checking claims, scanning for untruths, and deleting them. Facebook is ranking opinions using a subjective analysis of what it deems to be “authentic,” and, as a result, making them harder to find. It doesn’t delete opinions it deems inauthentic; it simply buries them, a method which, in effect, censors them.

How much weight is given to the subjective criteria versus the other criteria is unknown. However, I have personally reviewed comment threads where a comment had a significant number of likes, or even the most likes, and was not ranked as “most relevant,” which would seem to suggest that the subjective criterion, i.e., Facebook’s interpretation of the authenticity of the comment, is weighted much more heavily than the other listed criteria.
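To make the weighting argument concrete, here is a minimal sketch of how a weighted ranking score can let a subjective signal dominate engagement. The signal names, weights, and formula below are purely illustrative assumptions; Facebook has not published how its “integrity signals” are computed or weighted.

```python
# Hypothetical comment-ranking sketch. The signals and weights are
# invented for illustration; the real system is not public.

def rank_score(likes, replies, integrity_signal,
               w_likes=1.0, w_replies=1.0, w_integrity=10.0):
    """Combine engagement with an opaque 'integrity' term.

    A large w_integrity lets the subjective signal outweigh
    raw engagement, which is the effect described in the text.
    """
    return (w_likes * likes
            + w_replies * replies
            + w_integrity * integrity_signal)

comments = [
    {"text": "Popular but down-weighted opinion",
     "likes": 500, "replies": 40, "integrity": 0.1},
    {"text": "Modest comment, high integrity score",
     "likes": 20, "replies": 5, "integrity": 90.0},
]

# Sort descending by score: the low-engagement comment wins
# because its "integrity" term contributes 10.0 * 90.0 = 900.
ranked = sorted(
    comments,
    key=lambda c: rank_score(c["likes"], c["replies"], c["integrity"]),
    reverse=True,
)
```

Under these made-up weights, the comment with 500 likes scores 541.0 while the 20-like comment scores 925.0, so the popular comment is buried, matching the pattern described above where a most-liked comment was not ranked “most relevant.”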

One odd side effect of this “ranking” system: if you are notified of an interaction with one of your public comments, for example a response, and your comment has been buried, clicking the notification takes you to the thread but not to the actual interaction. You have to find and unbury your comment manually to get to the response. This certainly makes having a meaningful conversation much more difficult than if you were taken directly to the comment, and it clearly contradicts Facebook’s alleged “we’re making this more meaningful” purpose.

While debates rage on about Big Tech censorship, we, the users who care about free speech, are currently left with no recourse when a company like Facebook decides to censor opinion comments based on its determination of what is “authentic.” We need our representatives to take the initiative and help put an end to this “we will decide what you see” social media censorship.
