The unbearable lightness of regulation (and enforcement)
The Big Short is a movie I find myself returning to again and again. Each time I do, I have to temper my irritation at its sexism in parts: for example, the decision to cut from the film the prominent real-life female players featured in the Michael Lewis book on which it is based. Similarly, its tongue-in-cheek use of sexily clad (and unclad) women to explain complex financial instruments and practices really isn’t necessary. <Sigh>.
Having said that, I do like the movie. What’s more, I think it does a really good job of showing how toxic a mix of under-regulation (following a period of rapid deregulation) and under-resourced regulators can be, and how it can lead to unpredictable and dangerous outcomes. The larger and more complex the ecosystem, the more unpredictable the outcomes and the worse the fallout.
Two things happened last week that brought The Big Short to mind. The first was the launch of Doteveryone’s Green Paper, Making the Case for an Independent Internet Regulator. The second was the 2018 report of DeepMind Health’s independent review panel, which I finally got round to reading.
DeepMind, you might remember, was the subject of some bad press following its collaboration with the Royal Free on its acute kidney injury alert app, Streams. The bad press centred on legitimate concerns about the volume of patient data that the Royal Free shared with a private sector organisation, and on the fact that the patients’ consent had been neither sought nor obtained, even though most of them had never had an acute kidney injury. As far as I can remember, New Scientist broke the story; other media outlets picked it up, and soon both the National Data Guardian (NDG) and the Information Commissioner’s Office (ICO) were looking into the matter. Both took the position that the data sharing that took place during the trial had been inappropriate. When DeepMind Health was launched later in 2016, it established the independent review panel too. According to DeepMind, setting up the panel was “an innovative approach within tech companies” because it “forces us to question not only what we are doing, but how and why we are doing it.”
Anyway, in its second annual report the panel raised the ‘how do you make your money?’ question and was fairly forthright in pointing out the need for greater clarity about DeepMind Health’s ongoing relationship with its parent company, Alphabet. The panel based its review on a framework of twelve principles, all of which have merit, but two in particular surprised and impressed me:
- Principle 5 (anti-monopoly): The company seeks to ensure that it promotes competition, and encourages other organisations, including SMEs, into the market; in particular, the company will ensure that their systems are interoperable, using open APIs.
- Principle 11 (reasonable profits): The company will not use its assets or position to seek to extract excessive profits in its dealings with the public sector and will as far as possible operate contracts on an open book basis.
The panel’s assessment of DeepMind Health’s performance against these two principles didn’t allay my concerns, but I really like that the questions are being asked. I also like that DeepMind Health is opening itself up to this process. In its all too brief response, it acknowledged the panel’s concerns. Frustratingly, it didn’t provide any clarity on the important question of how it is going to make money, but it did talk about the Verifiable Data Audit infrastructure it’s building. I’m actually really excited about this; I’m so weary of governance for data usage that can’t be audited or enforced.
An expectation of safety
So here’s a company apparently taking the initiative when it comes to doing the right thing. Why, then, am I not in a happier place? It’s because while this sort of responsible corporate self-examination and accountability is laudable, it’s not enough. Corporate leaders leave (or get fired), shareholders make demands, and organisations’ priorities change; we can’t rely on companies always finding the discipline and moral character to consistently do the right thing. The review panel noted as much in its report:
“Or, if DeepMind Health’s current management were to leave DeepMind Health, how much could a new CEO alter what has been agreed today? We appreciate that DeepMind Health would continue to be bound by the legal and regulatory framework, but much of our attention is on the steps that DeepMind Health have taken to take a more ethical stance than the law requires; could this all be ended?”
This is why we have laws, and regulators who ensure adherence to those laws. They do this in the public interest: to protect people who are directly affected by companies’ products and services, as well as those who might be indirectly affected by the externalities they generate. This protection also extends to business leaders who are trying to do the right thing, because it’s expensive to do that when your competitors aren’t.
And that’s why I’m not happy. I don’t think we have the right combination of laws, and regulators who could meaningfully enforce them, to fulfil the expectation of safety that UK consumers have grown accustomed to. This is a dangerous place to be; attempting to navigate the new world using old rules makes it difficult to assess risk. Sometimes people sneer at the naivety of some consumers, quoting the axiom: ‘if you’re not paying for it, you’re the product!’ But that assumes that if you are paying for something, your ‘rights’ are guaranteed. They’re not: companies change their software terms and conditions post-sale. And if your product is dependent on web-based software (as many consumer goods increasingly are), there’s really nothing you can do about it. Something I argued in my 2017 TEDx Brum talk.
Also it can be genuinely hard to work out how companies make money (off you and your network) and assess whether you’re alright with it. In a recent survey, Doteveryone found that 24% of respondents admitted to not knowing “how [tech] products and services make money”. An even higher proportion (45%) were unaware of how targeted advertising worked.
But even if you’re informed and digitally savvy, interconnected digital ‘supply chains’ and pervasive tracking can still make it hard to know how to opt out. And that’s before we factor in that social data is often about multiple people, not just individuals (so it’s not down to just what you decide, something the ODI has been arguing for a while now), or that power dynamics mean that even if you know you’re being exploited, you can’t always afford to opt out. For these reasons and more, we need strong, independent regulators.
Pervasive regulatory fatigue
And that brings me to the other event I mentioned above, the launch of Doteveryone’s green paper. Specifically, the panel discussion and a point Ed Vaizey made about the pragmatism of real-life regulation: that it’s a collaborative process between regulators and the industries they oversee. Well, yes… and no. I think the scenario he describes works well in mature industries where there is relative stability and both technological and operational innovation have slowed down. A status quo has emerged, and it’s not really in anyone’s interest to rock the boat. Regulators and businesses are kind of on the same side. Think banking before it got sexy thanks to extensive deregulation. It’s easy to see how oversight might be implemented in this sort of environment. And it’s just as easy to see how it’s possible to believe that defunding regulators isn’t a big deal, or that an industry-funded one will suffice.
Alas, this sedate status quo is not the situation in tech. Sometimes it seems like there’s nothing but advantage in testing the boundaries. Remember when Google’s chairman and former CEO, Eric Schmidt, said Google’s policy was “to get right up to the creepy line and not cross it”? And the truth is, if you’re big enough and your technological infrastructure is advanced enough, you can probably outwit those who might seek to enforce those boundaries (see Uber’s Greyball app), at least for a while. Or tie them up in court (see Airbnb’s 2016 legal battle with New York lawmakers or, if you’re after a non-tech example, the Vote Leave campaign’s appeal* of the Electoral Commission’s findings that it breached spending rules).
All this adds up to one thing: effective regulation is a costly and labour-intensive business. Given the current funding levels (and, by extension, capacity and skill levels) of many regulators, is it any wonder that they are showing signs of fatigue? How else to explain the FCA’s decision regarding Barclays CEO Jes Staley’s repeated efforts to uncover a whistleblower’s identity, even though it seems an obvious violation of the spirit of the Senior Managers Regime regulations? Or the Financial Reporting Council’s inability to successfully take on the ‘big 4’ audit firms despite widespread criticism, including from parliamentarians, following high-profile company collapses, most recently Carillion?
If not this, then what?
So, regulating tech is going to need a different approach. But what should that look like, and are there lessons from history, other sectors and, most critically, The Big Short that we can draw on? I think so. But that’s the subject of another post.
For now, I’m going to leave you with a quote from The Big Short. It describes the impact of the extension of the bond market in a way that I think is analogous to the growth of the Internet: “The creation of the mortgage bond market, a decade earlier, had extended Wall Street into a place it had never before been: the debts of ordinary Americans”.
Like Wall Street, the digital world has integrated with everyday life past the point of no return, into pretty much every aspect of our lives. The truth is, whether individuals choose to engage directly with it or not, we’re all affected. If the ability to opt out can’t be guaranteed, then the right to protection must be enforced. We need effective regulation, and independent regulators, for that.
*Obviously this is their legal right.