Co-authored by Woodrow Hartzog and Evan Selinger


Turns out, you can’t create an idyllic global village by fetishizing connecting people. In the most recent chapter of the book on why Silicon Valley still doesn’t get it, the New York Times reported that Facebook “removed 66 accounts, pages and apps linked to Russian firms that build facial recognition software for the Russian government.”

It’s unnerving, to say the least, that prime suspects SocialDataHub and Fubitech might have scraped enough user profiles to create “a mirror of the Russian portion of Facebook” so Putin could repurpose the information and weaponize it for authoritarian surveillance. From fake news to state-sponsored tyranny, we’re living through a moment where all kinds of bad actors have their eyes set on Facebook’s treasure trove of personal data. Even assuming the platform was ever reserved for our “friends,” it isn’t anymore.

The Times writer wasn’t editorializing, so the coverage overlooked a critical part of the story. Facebook’s appetite for connecting users led the company to build its system in a way that made this kind of mass exfiltration of user data inevitable. Indeed, the platform could have gone a long way toward preventing this sort of thing from happening, just by making the single decision to protect users’ current profile pictures behind default privacy settings.

Right now, users have little choice in the public exposure of their profile pictures. Every single one of them is set to “public” by default. Even if you try to limit your current profile picture visibility using Facebook’s privacy settings for the individual photo, it will still be public. If you don’t want your profile picture to be public, the only winning move is to delete your account. That’s increasingly difficult to do these days, because not having a social media presence can limit your personal and professional opportunities and even raise the suspicion of authorities.

As a result, the company forces us to be vulnerable to hazards we should be protected from.

Zuckerberg and his team know full well that dangerous people and menacing organizations are tempted to extract the data, terms of service be damned. For these folks, our profile pics are an irresistible target.

It’s easy to miss the importance of Facebook’s policy on profiles and other mandatorily exposed information. After all, it’s not an obvious danger of modern Internet usage, like when hackers access passwords stored in clear text. Facebook manipulates what we see in many ways, including with carefully orchestrated press releases and talking points that focus our attention on the options the company deems worthy of providing.

After taking yet another reputational hit from the Cambridge Analytica scandal, for example, Facebook touted the new tools it created. Ta-da: Now users could more easily see and access some of their data, and adjust some of their privacy settings.

Like a good magic trick, however, this move focuses our attention on one thing and away from another. It obscures the privacy lapses hidden in the options that we aren’t given.

It might not seem like there’s a very big difference between having your profile picture and other commonly shared information be “public” and restricting its visibility to people to whom you grant explicit permission. After all, in the age of hyper-visibility and ubiquitous personal branding, people share pictures hyper-publicly all the time, not just on Facebook but also on Instagram, Twitter, and elsewhere. But there’s the rub: the difference between profile pictures and all the other photos of us online may seem small in isolation, but it matters a great deal at scale. And it all has to do with obscurity, an important aspect of privacy that concerns how likely others are to locate our personal information and to understand what the words or images mean in context.

Among other things, obscurity means that the harder it is to find personal information, the safer the information is from people we don’t want accessing it. These transaction costs significantly influence what people are willing to do, and the harder a bad actor has to work to access our data, the more persistent and determined the person has to be to accomplish her mission.

Sadly, the issue isn’t that the folks at Facebook are ignorant about how obscurity works. We know they get it, because sometimes Facebook’s user interface is configured in ways that directly acknowledge the importance of enhancing it. For example, Facebook allows users to click on an option for preventing search engines from linking to their profile information, and the primary value of this privacy setting is that it increases the transaction costs for others to find it. Like everyone else, Facebook knows that if digital information isn’t available on Google, it is less likely to be found by strangers.

If Facebook allowed our current profile pictures to be obscure by default, it would introduce friction into the digital ecosystem. Such friction wouldn’t create ironclad protections, but it would frustrate automated attempts to crawl the most popular name/face databases that feed the data-surveillance and facial recognition machine. Giving us this added peace of mind would be no small thing.
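To make the “obscure by default” idea concrete, here’s a minimal sketch of the policy change we’re asking for. Everything in it is hypothetical (the class and setting names are ours, not Facebook’s); the only substantive choice is that a profile photo’s visibility defaults to friends-only rather than public.

```python
from enum import Enum

class Visibility(Enum):
    PUBLIC = "public"
    FRIENDS = "friends"
    ONLY_ME = "only_me"

class ProfilePhoto:
    """Hypothetical model of a profile photo whose visibility
    defaults to FRIENDS rather than PUBLIC."""

    def __init__(self, owner: str, visibility: Visibility = Visibility.FRIENDS):
        self.owner = owner
        self.visibility = visibility

    def visible_to(self, viewer: str, owner_friends: set) -> bool:
        """Return True if `viewer` may see this photo."""
        if viewer == self.owner:
            return True  # owners always see their own photos
        if self.visibility is Visibility.PUBLIC:
            return True  # public exposure requires an explicit opt-in
        if self.visibility is Visibility.FRIENDS:
            return viewer in owner_friends
        return False  # ONLY_ME: no one else

# A scraper posing as a stranger gets nothing by default:
photo = ProfilePhoto("alice")
print(photo.visible_to("scraper-bot", owner_friends={"bob"}))  # False
print(photo.visible_to("bob", owner_friends={"bob"}))          # True
```

The point of the sketch is the default: a user who never touches her settings would still be invisible to an automated crawler harvesting public name/face pairs, which is exactly the friction obscurity is supposed to provide.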

Facebook is already aware of this problem. In India, the company is experimenting with some modest technological protections for profile pictures, such as limiting the ability for people to tag and download photos, and giving users the option of adding a blue shield and border around their profile image as a “visual cue of protection.” The company is also testing the option of adding a design overlay to the images to make it less likely for them to be copied.

These are useful features, at least compared with the rest of the world, where they simply don’t exist. But measured against the full range of options, they’re only half measures. If the features aren’t activated by default, users can take action only if they happen to recognize the problem on their own.

Either way, these tools will provide less protection in the next phase of the hackers’ arms race than if Facebook made the right call and empowered us to prevent our profile pictures from being “public.”

The folks at Facebook might even say they’re doing users a favor by allowing us to see a profile pic before sending a friend request. They might claim the site doesn’t want users to mistakenly invite the wrong person and thus expose their sensitive information. But that argument isn’t compelling: people can already verify identities reasonably well with the “friends in common” tool.

Finally, Facebook’s aggressive use of the “People You May Know” feature suggests that it is more interested in expanding connections than in taking obscurity seriously. The feature constantly pushes potential friends our way, and seeing their profile pics makes it easier to click on some of them. Facebook’s interest in capturing our social graph for further data harvesting shouldn’t weigh more heavily than privacy and security concerns. And yet it does.

Facebook’s policy on current profile pics helps demonstrate that it’s still not taking privacy seriously enough, despite the controversies. Consequently, we would like to offer the executives at Facebook and every other social media platform a true privacy challenge. It’s easy to implement and could be done in nearly no time at all — no additional meetings with politicians required!

Simply reverse course on mandating public profile pictures. Let users hide everything. Don’t just push for federal privacy laws that seem destined to be watered down or duplicative. Show the world that you can do better, even when it comes at the expense of your bottom line.


The authors would like to thank Joseph Lorenzo Hall at the Center for Democracy and Technology for his help with the complexities of this issue.