Why Facebook's decision not to give US users enhanced privacy standards might be a good thing

Despite what Cambridge Analytica did, it's important that we take time to fully think through what happened and how it should be regulated, rather than implementing knee-jerk, reactive punishments just because the world is on fire.

Joe Toscano⚡️ · Published in RE: Write · Apr 5, 2018

This week we saw reports that Facebook is installing GDPR standards into its platform, but only for users in the EU. Users in the United States, Canada, and everywhere else will keep the same standards they've always had. This could create an incredible cost for Facebook, which will now have to manage radically different privacy protocols across countries around the world, while making sure it doesn't slip up by using the wrong tracking system on EU citizens who might be living (even temporarily) in other countries.

However, there is also forward-thinking reasoning behind Facebook's decision to keep standards the way they were: the company understands that these issues are far more complex than most lawmakers appreciate.

The problem is that these issues are incredibly complex, but because the public is in an uproar, policymakers feel they have to jump quickly on a fire they don't fully understand how to put out. That puts them in a poor position to protect the public, and it's why we're getting knee-jerk, reactive regulation.

Right now, policy experts (and loud mobs of angry people) tend to believe we need to force these systems to give us more control over our data and more transparency about how that data is used. Their argument, while well intentioned, is generally uninformed about what is actually going on.

For one, regarding the call for more transparency: most of the sites under scrutiny have given consumers the ability to control their data, and wipe it clean, for years now. Right now you can get access to nearly all of the data Facebook holds on you by following these steps. The same is being done (and has been done) by Google, Microsoft, and many others. Or, as mentioned last week, check out this thread from Dylan Curran.

The thing is, most consumers really don't care until something bad happens. And even then, they generally go back to not caring within a short time, because an incredible number of breaches over the past couple of decades has desensitized the public to these issues.

To protect consumers from themselves, policymakers seem to think the best solution is regulation that forces companies to provide contextual data agreements and make data use more transparent. In the process, they're also pushing Facebook to be more responsible about who gets access to that data. This demand to restrict free data exchange could very well push the industry down a path that makes it harder to openly share data between systems, something we need these big companies to continue doing.

What happened with Cambridge Analytica was bad, but we have to remember it was legal and has been done by millions of organizations across the world. It's called advertising. And we've had targeted marketing (in its early form, direct mailers) since the Nixon campaign.

If you've ever logged in somewhere with Facebook, every one of those organizations probably has, or had, the same level of access as Cambridge Analytica. They just haven't (so far) been malicious with that data. And in the frame of the US election, the Clinton campaign could have done the same thing the Trump campaign did; it just didn't. That doesn't make the behavior of the Right any more or less malicious; it just shows the Clinton campaign's lack of technical chops.

So, to really understand this issue and where the blame should be placed, we have to consider it differently:

When someone builds a bomb and harms others, we don't blame the companies that sold the parts; we blame the bomber.

In the case of Facebook and the other Internet giants, we need free, open exchange of data. Maybe we make it harder to get certain types of information, or certain combinations of information; maybe there are extra steps to get the more sensitive details. We already have regulation in place for guns and other weapons we can look to for parallels. But in general, companies need data from these big corporations to thrive, and for that to happen we need free exchange of data. No small or medium-sized business could ever stand a chance of gathering this much data alone, so we need the big corporations to be allowed to share theirs. We cannot simply shut the exchange off, because that would be just as dangerous, if not more so.

These corporations have become the oil companies of our modern world, and if we pass legislation that forces them to strictly lock down their data, we may end up reinforcing their monopolies.

So, going back to the Cambridge Analytica case: when assessing blame, we shouldn't blame the big corporations (the companies that make and sell the bomb parts) for the way malicious companies (the people who build and ignite the bombs) use the data. Instead, we should blame, and punish, the people who built and detonated the bomb. That said, while policymakers' aspirations are noble in theory, they may end up backfiring and making the market more dangerous in practice.

HERE ARE SOME REASONS WHY:

1) Research shows that when systems make consumers feel they have more control over their data, consumers are actually willing to give up even more sensitive data than before, precisely because they believe they control it. But consumers don't understand how data can be weaponized, and it's nearly impossible to guess how a company will use a specific data set or combination of data sets. So this could actually backfire on regulators, making data collection easier, more intimate, and, ultimately, more dangerous to the consumer.

Expecting better privacy controls to make the Internet safer is like expecting food labels to make processed foods less fattening. What we really need is better consumer education and data literacy.

2) These laws could also end up stifling innovation, because small and medium-sized companies will be forced to operate with global scale in mind from the outset. That just isn't realistic for everyone, and it could force small, nimble companies to become slow, archaic systems that build to appease regulators rather than spending their limited time and capital on creating a great product.

3) Between stifling innovation and making it harder to share data, regulators may also end up reinforcing data monopolies, granting the incumbents the right to build even greater moats around the market. If these companies are forced to keep their data under strict control, they will effectively be regulated into legally mandated data monopolies. That would be the worst outcome of all.

And in fact, that's exactly what we're seeing happen as Facebook now works to control third-party access to the data while making sure it maintains all rights to use that same data as it pleases. If we create regulation that makes data monopolies not only legal but required, we will have screwed ourselves.

WHAT WE REALLY NEED IS:

1. Increased consumer education. We need these giants to be forced to educate the public, the same way Big Tobacco is forced to educate consumers about the dangers of smoking.

2. Proactive, adaptable regulation that positively reinforces good behavior in a way traditional, reactive law doesn't. We need systems that allow little companies to try new things and stay nimble while keeping the giants in check. And we can't keep wasting money fighting these giants in court. The EU fined Google $2.7 billion last year, but the case took the better part of a decade to resolve! That means Google was able to make nearly half a trillion dollars while a $2.7 billion fine floated through the courts; at that scale, companies would love this trade. We have to realize that $2.7 billion against more than $500 billion in revenue is a drop in the bucket, merely a speed bump in their process (a quick back-of-envelope calculation after this list makes the point).

3. More than anything, we need free exchange of data. Data is the infrastructure of the modern economy, and creating regulation that eliminates the opportunity for free trade goes against everything we know economically. Moving forward, we must remain vigilant in protecting the right to freely exchange data while also doing whatever is necessary to keep the public safe.

Public safety, however, will mostly come down to education, because if the public isn't data literate, no regulation we put in place will matter; people will still be ripe for manipulation by bad actors.
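To make the point about fines concrete, here's a rough back-of-envelope calculation. The annual revenue figures below are approximations used purely for illustration, not exact financial filings:

```python
# Back-of-envelope: how big was the EU's $2.7B fine relative to what
# Google earned while the case was pending?
# NOTE: revenue figures are rough approximations for illustration,
# not exact financial filings.

fine = 2.7e9  # EU antitrust fine, in USD

# Approximate Google/Alphabet annual revenue (USD) over the years
# the case was pending, roughly 2010-2017.
annual_revenue = [29e9, 38e9, 50e9, 60e9, 66e9, 75e9, 90e9, 111e9]

total_revenue = sum(annual_revenue)      # ~$519B
fine_share = fine / total_revenue * 100  # fine as a % of that revenue

print(f"Revenue while the fine was litigated: ~${total_revenue / 1e9:.0f}B")
print(f"Fine as a share of that revenue: ~{fine_share:.2f}%")  # ~0.52%
```

Even if those estimates are off by a wide margin, the conclusion holds: at this scale, a one-time fine measured in single-digit billions is background noise, not a deterrent.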

If you liked this, you should sign up for the Design Good newsletter and join over 1,400 people who have already done the same! You can also purchase my new book, Automating Humanity, at designgood.tech to learn more about all of this in much greater detail.

Also know that because Design Good is a 501(c)(3) non-profit, each purchase is tax deductible (a donation to the foundation), and 25% of all purchases go to youth technology literacy programs of your choosing. This means your purchase not only supports the mission of Design Good as a non-profit research org, but also funds the next generation's education, which will help future societies thrive!

If you don’t see the program you’d like to donate to, let me know and we’ll make sure your favorite program gets added to the list!


Joe Toscano⚡️

CEO, DataGrade; Author, Automating Humanity; Featured in The Social Dilemma; Contributor, Forbes. Changing the world w/ a smile, design & some code.