Inside Facebook’s Ongoing Fight Against Political Propaganda
Earlier this year, Facebook promised to crack down on fake news and political propaganda. The company has been blamed for a lot of the political polarization in recent elections in the United States and Europe, and company officials have vowed to take action.
It’s been a bit of a process, though. After the US Presidential election, Facebook founder Mark Zuckerberg scoffed at the idea that his social network could change the course of US politics. Users, pundits, and election watchers were not amused. Apparently, he’s not laughing now either.
According to the Associated Press, Facebook is (finally) admitting that governments and other bad actors are using the platform to shift political opinion and influence election outcomes. But the company is stopping far short of accepting culpability or responsibility for any of those outcomes, and that admission is really what critics are looking for. People don’t want Facebook to be a tool for manipulation, but that’s a tough subject to tackle without creating a butterfly effect of even more negative consequences.
The company rolled out passive content controls, allowing users to flag or block content they didn’t want to see. Unfortunately, that only concentrated the partisans and increased the enthusiasm for creating and sharing propaganda. Facebook hesitated to go any further with its controls, even after fingers were pointed in Europe after Brexit and in the US after Trump’s victory.
Now, Facebook is creeping closer to a solution that actually might make a difference, even as it flirts with the line between freedom of expression and partisan censorship. Facebook’s chief security officer told the AP they will be “monitoring” the efforts of “those who try to hurt civic discourse” using the platform.
According to the report, Facebook says: “(We) have had to expand our security focus from traditional abusive behavior, such as account hacking, malware, spam, and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people…”
The company is also working to locate and eliminate fake accounts and the people responsible for spreading fake news. And that’s really the rub. Users want to know who gets to decide which news is fake and which is merely partisan. No matter what action Facebook takes on this issue, someone is going to cry censorship. If enough people can be gathered beneath that banner, the social network stands to lose users and, in turn, revenue.
So, from a PR perspective, Facebook has to walk a fine line. There’s no doubt the company needs to do something to curb abuses, but it can’t step on too many toes in the process.
William Doonan is a tax law and legal expert in New York.