WhatsApp, Trust, & Trusts

Companies aren’t designed to keep promises. Trusts are. We should use them together to build trustworthy digital spaces.

Consent and WhatsApp’s New Terms of Service by Pat Walshe © 2016

Yesterday, WhatsApp changed its Terms of Service in order to share metadata with its parent company, Facebook, and allow for unsolicited advertising messages to be sent to users. There’s been a range of tone in coverage, from “I told you so,” to “here’s how to opt out,” to “quit WhatsApp, use Signal,” to “holy shit, can they really do that?” The privacy community isn’t wasting any time responding — UK regulators are promising an investigation and US advocates are planning a complaint to the Federal Trade Commission (FTC). Both allege deceptive practices and violations of consent protections, and both are responding within their jurisdictions. Whatever the repercussions, the regulatory and market responses will be interesting statements about public tolerance for digital impunity.

WhatsApp’s two defining promises were privacy and absolutely no advertising. Yesterday, they broke both of them — striking a big blow to users, advocates, and the idea of independent, commercially viable messaging. This isn’t to beat up on WhatsApp; it’s to point out how ill-suited companies are to keep any kind of promise — especially after being acquired, at an astronomical multiple of annual revenue, by an advertising company with a competing product in the market.

Companies, especially public companies, have a legal, fiduciary duty to maximize profit above all other things. That means the only legal way to justify user protections that mitigate monetization is to convince shareholders (the people who are making less money as a result) that user protections are critical to the brand’s value. WhatsApp is one of a very few platforms with the network lock-in effects of a 1B person user base, so it’s not surprising that the people who invested $20B+ in it want to squeeze it for revenue. And it may pay off — people’s reaction to the change will show whether a little broken trust and monetization are enough to significantly impact use.

It’s hard to know whether Jan Koum — WhatsApp’s co-founder and the person who made a lot of WhatsApp’s privacy promises — supports the shift. Any speculation now is academic, given Facebook’s ownership interest. It’s worth noting, though, that Mr. Koum has sold more than $2.8B of his Facebook stock since January — which is also when WhatsApp changed its business model (dropping the annual subscription cost).

As most major tech companies are figuring out (mostly the hard way), there aren’t a lot of ways to monetize messaging platforms beyond charging end users, providing value-added services, or selling data. This is especially true for private, peer-to-peer messengers, because charging end users rarely produces investor-satisfying exponential growth — and the other two options create security and/or privacy flaws. So, this was inevitable. Not for the reasons that most security researchers emphasize, but because of pretty basic market incentives and legal requirements. Which raises the question of whether this really was deceptive: did Mr. Koum make those promises knowing that he’d eventually sell them, along with user trust?

The question is important, not because it would change anything now, but because if it wasn’t deception — if Mr. Koum truly wanted to protect WhatsApp users’ trust — he could have. And the way he could have done it is literally called a “trust.”

Trusts are legally binding arrangements that do one very important thing — they create a fiduciary duty. A fiduciary duty is a legally enforceable responsibility to a group of people, called beneficiaries. Just like corporate directors have fiduciary duties to shareholders, trusts can be used to create fiduciary duties to users. WhatsApp could have put some of its assets (likely intellectual property) in trust for its users, creating a legally enforceable fiduciary duty to protect their privacy and/or never allow advertising. If that duty existed, users could have taken legal action when Facebook made this decision, instead of relying on regulators and non-profit advocates. Even better, WhatsApp could have used a Civic Trust, embedding user governance into the platform itself.

Ultimately, every platform is governed by a group of people — and fiduciary duties determine how they do it. Right now, those duties are almost all aimed at profit, but they don’t have to be. Founders of technology products and companies can use trusts to create new duties — at least protecting the promises they make to users, and at best building ways for users to protect themselves. The emergence and growth of B Corps are a signal of founder and investor interest in diversifying the promises companies can keep. WhatsApp’s about-face is a textbook example of why founders and early-stage companies should create duties that their companies must honor.

With trusts, we may be able to build companies that we can trust to keep their promises — which seems like a pretty basic requirement for those who will own the platforms, networks, and technologies we rely on to build the future. If Facebook’s WhatsApp teaches us anything, it’s that we have a ways to go before we can trust the promises of a company. Trusts may just fix that.