Take it on Trust

I have a guilty secret.

I am loving Alexa in my kitchen. The Echo Dot from Amazon tells me the BBC headlines when I want them, plays the latest Archers episode on demand, and streams whatever music I want to hear. Alexa sets timers and does conversions for me as I cook. And the voice recognition software works brilliantly.

But the more brilliant the technology, the more worried I become.

I recently had a conversation with John Wilbanks. John has been thinking for some time about how to ensure data sharing is used for good, and not just to boost corporate profits in Silicon Valley. He referred me to the interesting organisation Digital Public. Building off the cliché that data is the new oil of the digital economy, he neatly described Google, Amazon, Facebook, Apple and Netflix as the equivalent of OPEC in the 1970s.

With the Data Protection Bill currently going through the House of Lords, I too have been thinking about these issues. In my initial speech on this legislation I welcomed the advances in protecting data privacy that the EU’s GDPR, and this bill, offer; but I also raised ongoing concerns about the power of the big technology companies.

These concerns have been amplified by the decision in the US to end net neutrality. This is the opposite of the new European regulation, and is the ultimate in free-market deregulation gone mad. It allows vertical integration of content and infrastructure, further locking us into what corporations want us to see, read and understand.

Today I asked Alexa what she does with my data. She replied: “I only send audio back to Amazon when I hear you say the wake word.” She then referred me to the Amazon website for more information.

Being reassured by that requires a lot of trust. Trust that she is really asleep and not listening between “wake words”. Trust that the data that is gathered is used responsibly and ethically. And trust that, however benign the use of my data may be now, it won’t change in a direction I am unhappy with.

I am not using Alexa because I trust Amazon. In the end, I use it because I love voice as the best interface with technology when I am in the kitchen.

But I still worry about my data privacy.

And I am not reassured by the argument that “I am not doing anything wrong and have nothing to hide”. There are too many examples through history, around the world, of people doing nothing wrong who find that the rules change and their past behaviour comes back to bite them. With the growth of the Internet of Things we also find innocuous-seeming things like robot cleaners, and even vibrators, collecting data about our private lives.

In my speech in the Lords, I concluded by saying:

“The use of data to fuel our economy is critical. The technology and artificial intelligence it generates has a huge power to enhance us as humans and to do good. That is the utopia we must pursue. Doing nothing heralds a dystopian outcome, but the pace of change is too fast for us legislators, and too complex for most of us to fathom. We therefore need to devise a catch-all for automated or intelligent decisioning by future data systems. Ethical and moral clauses could and should, I argue, be forced into terms ​of use and privacy policies.”

Since delivering that speech in October I have been thinking about how this ethical capstone could be put over the activities of those that harvest our data.

My conversation with John Wilbanks led to me thinking about data trusts and data co-ops as a possible way through this tricky policy challenge.

Data trusts are the basis of the first recommendation in the review on growing AI by Wendy Hall and Jérôme Pesenti, commissioned by the UK Government. Their report suggested a narrower use than I am envisaging, but it is rooted in the same idea.

Strong technology, especially artificial intelligence, needs rich data. Rich data needs harvesting from multiple sources. As trust in technology erodes, consumers may decide to forgo the beautiful user experience of something like Alexa because they don’t trust companies like Amazon, and because they want those companies to pay fair levels of tax in each and every jurisdiction in which they trade. This is where a trust steps in.

Data trusts work on the principle that an individual’s data is worth very little, but in aggregate with enough others’ it is worth a lot. This is especially powerful if there is a concentration of data from a particular set of patients, transport users or local residents. Co-operatives work in similar ways, but with a more revenue-generating and revenue-sharing mindset.

These data aggregates are formed with clear aims related to how data is shared, with independent trustees appointed by the data contributors to solely pursue those aims. There could be a variety of trusts. I might choose one that is more permissive — I already use Alexa and a fitness watch. Others, with higher distrust, could choose a different trust with different attitudes to sharing.

But why would Amazon agree to give up so much power and value?

That is where government regulation comes in.

The current UK Data Protection Bill allows for data portability and the “right to be forgotten”. Technology companies will be required to make it easy for us to take our data away from the likes of Facebook and have our identifiers on our accounts deleted. This alone could be powerful, especially if repeated elsewhere in the world.

In an age where online petitions can gather steam quickly, there are many examples of corporations having to respond to public outcries — witness the fate of Travis Kalanick the founder of Uber. During such an outcry, consumers could be encouraged to act by using their new powers under GDPR to move their data and close their accounts.

This is the modern equivalent of industrial action. But instead of just withholding their data, as if on strike, they move it to a new entity. If enough move, the company would need to negotiate with the trust, the new owners of the collective data.

This still feels far off. It would need political organisation, or it could be helped along by another layer of government intervention.

For example, a government could decide that children’s data is particularly sensitive. It could then mandate that all children’s data is held in trusts that are independent of the commercial entity. Those trusts then provide a level of ethical safeguarding as the owners of the data.

These ideas feel like an exciting solution to my worries. To many they will seem naïve. But perhaps the first moves to organised labour by Robert Owen, the Tolpuddle Martyrs and the Chartists seemed naïve in trying to use collective action to force ethical safeguards in the last major industrial revolution.

It is time for us to organise around the politics of data. We are fast becoming more valuable for our data than our labour. Politics needs to respond accordingly.