How to hold platforms accountable without heavy regulation

Or, what is an “information fiduciary,” and how can it build trust?

Nancy Watzman
Trust, Media and Democracy
4 min readSep 19, 2018


Source: Merriam-Webster.

Daily life is an exercise in trust. I trust that when my doctor tells me I need surgery, it's her best judgment that I do, based on the evidence available to her. I trust that the investment advisor who oversees my retirement funds is acting in my interests. I trust that the real estate agent who sold me my house did not hide any information from me. It's not that my trust is blind; I seek second opinions and read financial disclosure documents. But I also expect a certain level of professionalism, because these people have a fiduciary duty to act in my interests.

Today Jonathan Zittrain, Harvard law professor and a member of the Knight Commission on Trust, Media and Democracy, argues in Harvard Business Review that establishing an "information fiduciary" policy, an idea he has been developing with Yale Law School's Jack Balkin, could be a useful tool in another arena where we're asked to trust: holding social media platforms accountable.

From the article, which can be read in full here:

Our use of information platforms, whether search engines or social media, originally was not much tailored to anything about us, except through our own direct choices. Your search engine results for a term like “are vaccinations safe” would be the same as mine, or, for a term like “pizza,” varied on something as straightforward as our respective locations, to offer nearby restaurants. …

Today, that’s not true. We’ve moved from a world of pull to push, where people don’t search for specific things but read whatever feeds they’re handed by services like Facebook and Twitter, and thus are much less involved in prompting what they’ll get back. And more and more, people are experiencing not a range of search results from which to choose, but a single answer from a concierge like Amazon’s Alexa. These concierges might rouse themselves to suggest that it’s time to procure a gift for a friend’s birthday (perhaps from a sponsor), or insist on recommending Uber over Lyft when asked to procure a ride (again, thanks to sponsorship)….

We’ve also moved to a world where online news feeds — or concierges’ answers to uncommon questions — are aggressively manipulated by third parties trying to gain exposure for their messages, and there is a great concern about what happens when those messages are propaganda: that is, false, and offered in bad faith, often obscuring their origins. Elections can be swayed, and people physically hurt, by lies. Should the platforms be in the business of deciding what’s true or not, the way that newspapers have purported to do for years? Or does that open the doors to content control by a handful of corporate parties — recognizing that Facebook controls access to far more eyeballs than a single newspaper ever did — or by the governments in a position to regulate them?

What would establishing information fiduciaries look like in practice? There are a number of ways forward. For example, Zittrain and Balkin have proposed that companies could become fiduciaries by choice rather than by mandate: U.S. federal law could offer relief from individual state requirements to companies that opt in. The incentive is complying with a single federal standard rather than with 50 different legislatures.

Second, Zittrain writes that firms should have a means for ethical issues to be surfaced, discussed internally, and then disclosed externally. This is a tougher ask, as many of the issues raised are new and have no clear answers. There should be a way for employees to raise concerns without fear of being punished. This concept exists in some other fields, such as medicine, where it's known as "just culture."

Third, he argues there should be a way for firms to seek opinions from any public body that has authority to make judgments about new issues before they implement changes. This is similar to when taxpayers ask the Internal Revenue Service for a “private letter ruling” before they commit to one tax strategy or another.

Zittrain makes the case that we can’t count on platforms to have our best interests at heart unless we create structures for accountability. As he writes in HBR:

The problems arising from a surveillance-heavy digital ecosystem are getting more difficult and more ingrained. It’s time to think of a comprehensive approach, sensitive to the complexities and geared towards addressing them as they unfold, centered on loyalty to the individual users whose data is otherwise weaponizable against them.


Nancy Watzman is director of Lynx LLC, lynxco.org. She is former director, Colorado Media Project; outreach editor, Knight Comm on Trust, Media & Democracy.