A privacy talk with WhatsApp co-founder Brian Acton
This is an unedited phone interview with WhatsApp executives Brian Acton, co-founder, and Matt Steinfeld, head of communications.
They spoke in light of a survey commissioned by the company about Brazilian users and online messaging privacy and security. I published a story in the newspaper Folha de S.Paulo as a result of this talk.
The results were somewhat predictable: 94% of Brazilians that use the service care about privacy; 71% share sensitive information (personal/family matters; documents; financial, work- and health-related issues).
Around three-quarters of them said they were against judicial orders like the ones that resulted in a temporary ban on WhatsApp — imposed because the company said it couldn’t comply with orders to hand over chats from criminals under investigation.
Q: I took a look at the survey results. Why did you [the company] order it?
Acton: A lot of this revolves around the fact that Brazil is critical, one of our most important countries. And we wanted to sort of understand how people in Brazil feel about encryption, the sharing of personal information, etc.
The results… we found that the vast majority of users are sending very personal information, information they want to keep safe and secure. Providing end-to-end encryption is something that delivers that. It gives people peace of mind that conversations aren’t being listened to or surveilled. And they feel more comfortable sharing even more trusted information, transactions and credit cards. Stuff like that. Through our systems.
They [users] feel more comfortable sharing even more trusted information, transactions and credit cards. Stuff like that. Through our systems.
Overall the survey went well, but it was important to us that we did this with Brazil. Because of course it is one of our most important countries.
Q: Do people feel safe while doing that type of thing?
Absolutely, they feel safe. More and more people are starting to understand what end-to-end encryption means. Through organizations like yours that help to educate people, and help them understand that end-to-end encryption is actually what everyone should want and need to promote a safer world.
Q: You mention the media. The Guardian published a story about a vulnerability [in the app] that got quite a response. How was that received inside the company, and how do you feel that the public in general should react to it?
I think the security community at large rallied behind us. And exposing… really, no. That is not a vulnerability, it’s really a feature of our product: if people feel really good about their conversations and they trust those conversations, then they can turn off certain notifications and not worry.
It’s a feature of our product
We do have embedded security notifications in the app. In many ways they sort of debunk what The Guardian was saying. And as a result much of the security community rallied behind us, including the EFF. The EFF actually wrote up a pretty good [article] sort of debunking The Guardian article.
In short, we are trying at all times to balance privacy and security with simplicity. In some cases, we made small trade-offs — not trade-offs, we built specific features and capabilities. At all times, if people want the most secure experience, we encourage them to turn on the security notifications and verify when those notifications alert users. That’s the best protocol to employ.
The vast, vast, vast majority of users really don’t care. So we felt that it was better to err on the side of simplicity, with the default being off. And, you know, we are looking into improving security of the product, and having the dialog with The Guardian is a good dialog to have just because it’s thinking in new and creative ways that make our product better.
We are trying at all times to balance privacy and security with simplicity. (…) The vast, vast majority of users really don’t care
Q: The app Signal [which uses the same encryption protocol] doesn’t deliver messages right after a user changes their [security] key. (This is where WhatsApp’s supposed “backdoor”, as reported by The Guardian, resides.) Is that something that you would be looking at?
We’re always considering different options. As it stands today, what we know is that a very large number of device delete/reinstall events are… frankly, they are false positives when it comes to the security notifications. We just didn’t want to inundate people with security notifications and create sort of an alarmist reaction. We’d rather support the people who are especially paranoid and give them this setting or capability.
We just didn’t want to inundate people with security notifications and create sort of an alarmist reaction
Part two is we just don’t want messages to stop flowing. Right. And when you start to do things like hold messages, or drop messages because the security changes, you are actually impeding… you are adding more friction that can be detrimental to the health of the system. So we made a very deliberate choice that we wanted people to get their messages alongside the security notifications.
Q: The survey included the cases in which judges in Brazil blocked WhatsApp access because of non-compliance with investigations. And this is very likely to happen again.
We are always working hard to support law enforcement requests. The real struggle for us is that we don’t have the ability to produce data at their request. And so it’s a hard message for law enforcement to receive, but it’s frankly the best that we can do at the moment.
Q: You’re saying you simply have to answer that you can’t help them.
We’re unable to [comply]. Our system is designed so it doesn’t allow access to the encrypted messages that are flowing through our system. So we don’t really have a mechanism for providing law enforcement with unencrypted content.
Q: Facebook’s page about governmental requests includes a “restrict content” section. Facebook says it sometimes agrees to do it. Is that ever the case with WhatsApp?
Steinfeld: There is a difference between the two services. Facebook, as you know, is a place where people go to share content with friends or share publicly. The government requests page, when it references content restrictions, notes cases in which Facebook is required to make some content inaccessible in that country. To put that in context: it’s not something that WhatsApp has the capability to do.
In certain countries like Germany and France, for example, it’s illegal to deny the Holocaust. So under German law, Facebook has to restrict posts that deny the Holocaust. Those are differences between how Facebook and messaging apps like WhatsApp work.
Q: When that happens inside WhatsApp, that’s impossible to know, right?
Acton: Exactly. And the important part there is that the end-to-end encryption helps to protect that. People’s communications can flow and they can flow in a private basis. I think there’s a big difference between public vs private. How people use Facebook and how people use WhatsApp.
Q: The security page inside WhatsApp’s settings menu says that “whenever possible, the encryption is gonna happen”. When is it not possible?
There are certain pieces of information that are not encrypted: a group chat title, for example, is not encrypted. Your profile name is not encrypted. There are some small pieces of data that are used in the transport of messages, but this is the metadata realm of messages, not the actual message content itself. All message content is encrypted.
Q: Even in older phones?
On every platform supported by WhatsApp, yes. Android, iPhone, Windows Phone etc.
Steinfeld: One thing to remember: when we rolled out end-to-end encryption back in April, everyone had to be on the latest version of the app. So there was a period when you could be communicating with someone who wasn’t on the latest version of WhatsApp, but that was almost a year ago. Now every message is encrypted by default.
Acton: We will in fact update the language this year to make it even stronger.
Q: Recently, Facebook said it had stopped using WhatsApp user data to display targeted ads in Europe. What type of data was being used, and is this happening in other parts of the world?
Well, prior to the [updated] terms of service, we were not sharing any data with Facebook in any way. At a general, sort of high level, after the terms of service, the critical piece of information that we share is the telephone number, as a means to help Facebook add to the product experience.
Q: So basically what’s shared is a phone number, that’s then linked to an account, a profile on Facebook. Is that right?
That’s the general idea, yes.
Q: And how does that help with displaying ads?
Facebook runs a pretty sophisticated ad targeting system. To be honest with you, I’m not the expert on it. But insofar as they have data sources that include telephone numbers, they are able to cross-reference them and show appropriate ads as a result.
Steinfeld: To go into more detail, Facebook has a product called Custom Audiences. Let’s say you’re Folha de S.Paulo [newspaper] and you want to run an ad campaign targeting existing customers whose phone numbers Folha has. Folha can upload those phone numbers to Facebook to target its existing customers. Because your phone number is your account ID on WhatsApp, if you have a Facebook account, that is a way to identify you as someone who might want to see that message from Folha.
I think the important thing to note is that when we announced the update last year, we gave users the opportunity to opt out of sharing WhatsApp information for those kinds of purposes, like ads.
Q: There was a toggle inside the privacy settings, but I don’t see it on my phone. I know you can opt out before you agree to the user terms. Has that setting changed, or is it just my phone?
Steinfeld: We gave people the opportunity to opt out before they agreed to the terms, and after that we gave users a 30-day period after agreeing to opt out as well. We updated [the terms] in August, so by now we are past that 30-day window.
Q: For long-time users, is there any way they can do it?
No, not at this time.
Q: Unless I change my number, for instance?
[Even] if you change your number, you go through the new terms of service flow, and opting out is not available in that flow.
Q: Would you want to tell me how the relationship with Facebook has been since the company moved to Menlo Park?
The relationship has been great. We are now at headquarters, in building 10, and we had a great welcome from Facebook, the bigger company. In terms of… it has really facilitated collaboration with a lot of the Facebook teams. It’s also given us the opportunity to use more of Facebook’s resources, infrastructure, etc.
And if you think about it, it’s sort of the next logical step as part of an acquisition: get more people excited about the product, etc. It’s been a good move, and Facebook has been supportive of our move to Menlo Park. Things are going well. Our employees are really happy with the move.