Facebook Privacy: Lessons Learned from Congress’s 2000+ Follow-up Questions

Demystifying hundreds of pages of answers and fluff

Anthony Liu
Published in HackerNoon.com · Jun 21, 2018


This April, Mark Zuckerberg testified before Congress to address how Facebook handles your data. In May, Congress sent over 2000 follow-up questions, and earlier this month, Facebook released hundreds of pages of responses. I read through those answers, and in this article, I’ll present the biggest takeaways in the context of other recent news.

What Facebook collects, from bad to WHAT??

Many of Congress’s questions related to what information Facebook collects on users in the first place. There’s the obvious stuff. The posts, the likes, the photos. But there’s a lot of information Facebook collects that people wouldn’t expect.

Like many other companies, Facebook lets users sync their contacts. To Facebook, that means your call and text history. Creepy, sure, but Facebook promises it’s just to find your friends… right?

Unclear. In their official responses to Congress, Facebook reveals that their apps send back information about your text messages and calls that isn’t at all necessary for this feature (“Senate” 12). In fairness to Facebook, users need to actively click “okay” to share their call and text history. But there’s a lot that’s shared that you don’t have the chance to consent to.

Facebook grabs everything from your cell signal strength to your battery level to the plugins in your browser (93). Each of these traits is innocuous enough on its own, but taken together, they uniquely identify you on the internet, even if you’re not logged in. As clever as you may think you are switching to incognito mode, you’re still at 62% battery, and you’re not fooling anyone.
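To get a feel for why this works, here’s a minimal sketch of device fingerprinting in general (this is not Facebook’s actual implementation, and the trait names are made up for illustration): a handful of boring attributes, hashed together, become a surprisingly stable identifier.

```python
import hashlib
import json

def fingerprint(device_traits: dict) -> str:
    """Combine individually innocuous traits into one identifier."""
    # Serialize with sorted keys so the same traits always hash the same way.
    canonical = json.dumps(device_traits, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# None of these fields names you on its own...
traits = {
    "browser_plugins": ["pdf-viewer", "widevine"],
    "screen_resolution": "1440x900",
    "timezone": "America/New_York",
    "os_version": "iOS 11.4",
    "battery_level": 0.62,    # volatile traits like these add extra, short-lived signal
    "cell_signal_dbm": -71,
}

# ...but together they form a fingerprint that follows you across
# logged-out sessions and incognito windows.
print(fingerprint(traits))
```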

Facebook also pays attention to the WiFi access points and Bluetooth devices you walk by as you go about your day (94). That has little to do with your experience on Facebook, WhatsApp, Instagram, or Oculus; the only use I can think of is tracking where you are in the world.

They also track where you are on the web. There are 8.4 million websites that embed the Facebook Like button, allowing Facebook to record which websites you visit as you’re surfing the internet. Facebook admits to using this approximation of your browser history to run targeted ads if you’re logged in (159). They promise they don’t use this info to target you when you’re logged out, but they still store the websites your device visits.
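To see why an embedded button is enough, here’s a hypothetical sketch of what the widget’s server sees. The handler and field names are invented, but the basic mechanism (the page you’re on arrives in the Referer header, and a tracking cookie identifies your browser) is how third-party embeds work in general.

```python
# Hypothetical server-side view of an embedded "Like" button: every page
# that embeds the widget makes your browser request it from the social
# network's servers, and that request carries the page you were reading
# plus the tracker's cookie.
from datetime import datetime, timezone

browsing_log = []

def handle_widget_request(headers: dict) -> None:
    browsing_log.append({
        "visitor_cookie": headers.get("Cookie"),   # identifies the browser/device
        "page_visited": headers.get("Referer"),    # the site you were on
        "seen_at": datetime.now(timezone.utc).isoformat(),
    })

# A single afternoon of browsing sites that embed the button:
handle_widget_request({"Cookie": "fr=abc123", "Referer": "https://news.example.com/article"})
handle_widget_request({"Cookie": "fr=abc123", "Referer": "https://shop.example.com/handbags"})

for entry in browsing_log:
    print(entry)
```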

For some of you, the story may even be worse. Onavo is a user-friendly VPN app that Facebook bought in 2013. Normally, VPNs connote security, privacy, and protection, but Facebook transformed Onavo into a vehicle for spying on every byte sent and received from your phone. In Facebook’s own words: “When you use our VPN, we collect the info that is sent to, and received from, your mobile device” (124).

If you were racing to the “continue” button, would you notice that Onavo was a Facebook product?

Clearly, Facebook does a phenomenal job learning everything about your online presence. Unfortunately, that’s not nearly enough.

Businesses send Facebook information about you that ordinarily only they would know, like how long you were playing Fortnite yesterday to which purse you bought at your local JCPenny’s last week (18). To put this in perspective, there are more than 70 million businesses using Facebook. And yes, you read that correctly. Facebook knows about actions you take outside of the internet, and they receive this information from partners regardless of whether or not you have a Facebook account (18).
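For a sense of what such a hand-off could look like, here’s a purely hypothetical payload a retailer might upload to an ad platform to report an in-store purchase. The field names are invented for illustration; this is not Facebook’s actual API.

```python
# Hypothetical offline-purchase upload. Field names are made up.
offline_event = {
    "event_name": "purchase",
    "event_time": "2018-06-14T18:32:00Z",
    "value": 49.99,
    "currency": "USD",
    "store_location": "JCPenney #1124",
    # Hashed contact details let the platform match the purchase to a
    # profile (or a shadow profile) even if you never log in.
    "match_keys": {
        "email_sha256": "<sha256 of lowercased email>",
        "phone_sha256": "<sha256 of normalized phone number>",
    },
}
```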

Tips and tricks to get even more data

Facebook has a long relationship with data brokers, companies that buy and sell consumers’ personal data. Although they’ve taken recent steps to distance themselves from these brokers, Facebook has yet to provide any transparency into the data they collect from these mysterious, sketchy companies. Facebook’s tool to view the data they claim they have on you only shows a fraction of what they actually know (119).

Most people haven’t heard of data brokers before, so let’s look at a fun example: LocationSmart. LocationSmart tracks the real-time location of every mobile device in the US. They work with major carriers like Verizon and AT&T to collect the data necessary to provide this service. All their customers need is your phone number to retrieve an accurate estimate of where you are.

And last month, news broke that LocationSmart had a vulnerability that allowed any random person to submit a query and find your real-time location. It took Verizon, AT&T, Sprint, and T-Mobile over a month to decide to stop selling your location information to this irresponsible data broker. The worst part is there are many dozens of data brokers that deal with all categories of your sensitive personal information.

On a more positive note, the EU’s General Data Protection Regulation was a huge win for consumer privacy around the world. Unfortunately, when it came into force on May 25, 2018, Facebook had to get users to agree to some policy changes, and a formal complaint alleges that Facebook faked notifications so that users would agree more quickly.

Fakebook faking a notification to get you to skip reading a policy change faster.

From the complaint:

The only option for a user was therefore to accept the new terms and privacy policy, or to delete the account. There was no option to disagree, opt-out or say no in any other way, shape or form.

Don’t let vulnerable public apologies from Facebook’s CEO lower your guard. Even after Facebook promised to change for the better, it seems they’re still willing to do whatever it takes to get more of your data.

For instance, Facebook analyzes the content that your friends share, like photos and events, to infer information about you. They know where photos were taken from their metadata, and they know which photos you’re in thanks to facial recognition (142–143, 222). Ergo, they might know where you are even if you didn’t tell them.
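Here’s a small sketch of how much a single shared photo can give away, using Pillow to pull GPS coordinates out of EXIF metadata. The file name is hypothetical, and this assumes a reasonably recent Pillow where EXIF rationals convert cleanly to floats.

```python
from PIL import Image
from PIL.ExifTags import GPSTAGS

def gps_from_photo(path):
    # _getexif() returns raw EXIF data keyed by numeric tag id.
    exif = Image.open(path)._getexif() or {}
    gps_raw = exif.get(34853)  # 34853 is the standard "GPSInfo" tag
    if not gps_raw:
        return None
    gps = {GPSTAGS.get(k, k): v for k, v in gps_raw.items()}

    def to_degrees(dms, ref):
        # Latitude/longitude are stored as degrees, minutes, seconds.
        d, m, s = (float(x) for x in dms)
        deg = d + m / 60 + s / 3600
        return -deg if ref in ("S", "W") else deg

    return (
        to_degrees(gps["GPSLatitude"], gps["GPSLatitudeRef"]),
        to_degrees(gps["GPSLongitude"], gps["GPSLongitudeRef"]),
    )

print(gps_from_photo("vacation.jpg"))  # e.g. (42.3601, -71.0942)
```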

Who they share your data with

Collecting data isn’t inherently evil. It’s unsettling for an organization to know so much about our lives, and LocationSmart showed us why centralizing data is terrifying, but we ought to give Facebook a chance. What do they do with our data?

The data Facebook has on you helps them show you extremely relevant ads. They don’t need to share your data with any third parties to do this. They have your information, and they serve the ads. In a nutshell, this is what makes Facebook so much money.

We can’t fault Facebook for doing this — they provide us a free service, and we agreed to their Data Policy when we signed up (even if we didn’t all read it). This much, people expect and understand. But of course, there’s always more.

Facebook uses its data to do a lot of research, and I’m not talking about all the neat AI stuff that FAIR is up to. I mean controversial social experiments that toy with tens of millions of unsuspecting people’s lives (99). In a study published in 2012, Facebook described an experiment on voter turnout: they manipulated the news feed of every American user over the age of 18 who used Facebook on November 2, 2010, the day of the midterm elections. In total, their experiment involved more than 60 million people, and they demonstrated that they could measurably impact voter turnout by varying the content they showed to each user.

To be explicitly clear, these 60 million people did not consent to taking part in this political experiment. What if Facebook had innocently decided to experiment on Republicans and Democrats differently, not yet knowing what impact, if any, its platform would have on voters? What if Facebook had accidentally tipped the scale for congressional elections across the country?

I’m not alleging that Facebook has ever used this gross power to manipulate elections in the past. I’m not even claiming that there is a risk that they ever will. I’m pointing out how dangerous it is for a centralized platform like Facebook to have so much control with so little active consent.

Let’s take a look at another example, published in 2014: Facebook’s infamous mood experiment. By changing the content of people’s news feeds and analyzing the sentiment of their statuses, Facebook tracked the impact they had on the positivity and negativity of people’s posts. If you were one of the nearly 700,000 people experimented on in this study, then Facebook may have been the reason you had a sadder week than normal on January 11–18, 2012.
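The published study measured mood by counting positive and negative words in people’s posts (it used the LIWC word lists). This toy version with tiny made-up word lists shows the idea, not the actual methodology:

```python
# Toy sentiment scoring: stand-in word lists for illustration only.
POSITIVE = {"happy", "great", "love", "excited", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "lonely", "terrible"}

def mood_score(status: str) -> int:
    words = status.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# Compare the tone of a user's posts before and after their feed was altered.
week_before = ["Had a great day with friends", "Love this weather"]
week_during = ["Feeling kind of lonely tonight", "What an awful week"]

print(sum(mood_score(s) for s in week_before))  # prints  2
print(sum(mood_score(s) for s in week_during))  # prints -2
```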

But Facebook never asked for your permission, so there’s no way for you to know.

By the way, Facebook told Congress there’s no way for you to opt out of these experiments (125).

On the bright side, at least these researchers have good intentions. They’re not out to get you; they’re just pursuing knowledge at all costs and using the resources available to them. What about the other people Facebook shares your data with?

By now, you’ve probably heard something about Cambridge Analytica, if only the name. Here’s the tl;dr: Facebook shares your data with the apps and games you use, and for a time, they also gave those apps your friends’ data. A company called GSR used this to collect data on up to 87 million Facebook users, including their private Facebook messages (24). They sold this data to third parties like Cambridge Analytica that used it for nefarious election purposes.

The people directing GSR were Joseph Chancellor and Aleksandr Kogan. Here’s an odd fact: Facebook hired Joseph Chancellor around the time they found out about the Cambridge Analytica fiasco (203).

Relatedly, there’s been a lot of talk about Russian advertising meddling with the 2016 election. The Russia-based Internet Research Agency (IRA), which attempted to “deceive and manipulate people in the US” by using Facebook to mess with voter turnout, reached up to 126 million Facebook users (205).

In these cases, third parties, not Facebook, were the villains abusing your and your friends’ data. There’s no way to know how many other entities there are like this, but we can estimate. Since 2006, Facebook has “sent over 1,150 cease-and-desist letters to over 1,600 targets” and restricted or removed more than 300,000 apps for violating users’ privacy (129). Clearly, Facebook is hard at work auditing a ton of apps, but the fact that this is such a huge problem leaves me feeling uncomfortable.

Parting thoughts

It probably seems like Facebook is the bad guy, but having read their replies to Congress’s million questions — and many of their responses were total dodges, by the way — I don’t fault Facebook for too much. Facebook’s mission is just incompatible with charging its users; you can’t connect the whole world with a paid app. Maybe the users that can afford to pay can cover the cost for those who can’t. But Facebook is a business, so they opt for the more straightforward solution.

Companies aren’t in the business of taking advantage of your privacy. They’re just stuck in a system in which the only business models that make sense are those that require them to monetize what they know about you. As the amount of data we produce increases, this problem will only grow in scope, so one way or another, through regulation, technology, or plain old transparency, something about the data ecosystem has got to change.

[1] “Senate commerce committee questions for the record.” <https://www.commerce.senate.gov/public/_cache/files/...>.

Thanks for reading. If you enjoyed this article, clap it up and/or start a discussion in the comments below.

Until next time!
