Can We Trust Facebook to Keep Our “Digital Living Rooms” Safe From Liars, Racists, and Haters?

Omidyar Network
Mar 11, 2019

By Anamitra Deb, Beneficial Tech Lead, Omidyar Network


Yet again, Facebook wants to have its cake and eat it too. Last week, Mark Zuckerberg announced plans to build a new offering focused on private conversations. Combining the billions of users across WhatsApp, Messenger, and Instagram, this still-to-be-built product is pitched as the future of communication: private, encrypted, secure.

He laid out several principles to guide this pivot, including private interactions, so that communications sent and received remain directly under the user’s control, and encryption, which helps ensure that private communications are seen only by the people to whom they’re sent, not by hackers, criminals, governments, or even the platforms themselves.

On the face of it, there seems to be much to like, not least because many other offerings (Signal, Telegram, and of course WhatsApp) already incorporate many aspects of it. And, following Facebook’s failed attempt to acquire Snapchat, it also borrows Snapchat’s ephemerality: users’ digital footprints disappear over time.

But we’re skeptical. For us, this announcement raises far more questions than it answers.

For what it’s worth, that isn’t even the full extent of our concerns. Zuckerberg himself admits that there are tradeoffs to encrypted, private communications, noting the very real potential for “truly terrible things” such as child exploitation, terrorism, or extortion. He says, “we will never find all of the potential harm we do today when our security systems can see the messages themselves.” Given Facebook’s grievous track record on this point, even on the open platforms where it can see all the content, this should give us all serious pause. What will happen when no one, including the platform itself, can see the private messages that are sent and received?

A “digital living room” filled with haters, liars, racists, and organized commissioners of violence should hardly be the social media vision we all accept.

At Omidyar Network, we believe that in order to keep our “digital living rooms” safe, we must get smart on these issues, and we must do so right now.

To do that, we need answers to at least three questions:

1. How do users actually use WhatsApp?

We have precious little data beyond the total number of WhatsApp users per country. How many groups does the average user belong to? What content do they see and engage with most frequently, and what do they do with it? How much of that content is malicious, whether deliberately deceptive or hate-filled? If Facebook is serious about safe and valuable “digital living rooms,” it needs to make (meta)data sets transparent. It needs to welcome what civil society and researchers in countries such as Brazil, India, and Myanmar are learning about users’ behaviors and complaints, and to get serious about the right mix of technological and governance solutions.

2. Who are the organized perpetrators of disinformation and dangerous speech?

What are their incentives, methods of content creation, and dissemination patterns, and how effective are their campaigns? What have technologists, researchers, and civil society groups learned about promising or effective ways of dealing with this — if any — given that the “open social” norms of content guidelines and company policing cannot work on the “closed social” apps?

3. What framework balances our need for encrypted, private, non-surveilled speech against safety and protection from disinformation and dangerous speech?

Who should be at the table when working out these critical “rules of the road” and who will police the ensuing balance to ensure that we don’t come down too hard on one side?

Put simply: we need more information about these issues, and an acknowledgement of their scale and gravity. We need transparency and cooperation from the platforms about how private and encrypted spaces are already contributing to dangerous societal trends, and commitments of resources to smarter solutions. We need sustained public pressure from users and their communities.

If Facebook doesn’t follow through on its claim to make our “digital living rooms” safe, its platforms will become more covert, unintelligible, and dangerous versions of the chaos the flagship platform is today.
