Designing digital services that are accountable, understood, and trusted (OSCON 2016 talk)

These are the speaker notes and slides from my talk at OSCON 2016 last month.

Hello.

Welcome to this session about the power and importance of designing digital services that are understandable, accountable and trusted.

I’m hoping to convince you that designing digital services that are understandable, accountable and trusted is now a commercial as well as a moral imperative, and that building an open society in the digital age is about more than open code.

And that, whether you like it or not, if you work in software or design in 2016 you also work in politics.

My name is Richard Pope. Quick bit about me.

I worked for the Government Digital Service for 5 years, part of the team that delivered GOV.UK. I then went on to work with teams across different government departments and policy areas.

Before that, I set up the labs team at Consumer Focus, the UK’s statutory consumer rights organisation, building tools to empower consumers. I’ve worked at various commercial startups including moo.com and ScraperWiki.

I co-founded the Rewired State series of hackdays back in 2007 that aimed to get developers and designers interested in making government better.

The last piece of work I did in government was on a conceptual framework for the idea of Government as a Platform. Government as a Platform is the idea of treating government like a software stack to make it possible to build well-designed services for people.

The work involved sketching some ideas out in code, not to try and solve them upfront, but to try and identify where some of the hard design problems were going to be.

Things like: what might be required to enable an end-to-end commercial service for buying a house?

Or what would it take for local authorities to be able to quickly spin-up a new service for providing parking permits?

With this kind of thinking, you rapidly get into questions of power.

What should the structure of government be?

Should there be a minister responsible for online payment? Secretary of State for open standards?

What does it do to people’s understanding of their government?

How and where should you build in opportunities for recourse for when things go wrong (which does happen from time to time)?

This is the Arthur Dent problem. In The Hitchhiker’s Guide to the Galaxy, Arthur has just woken up to the surprise that the council are going to bulldoze his house:

And that, at the heart of it, is the subject of this talk.

How do we build stuff that people can understand and trust, and is accountable when things go wrong?
How do we design for recourse?

I’m hoping to convince you that, whether you like it or not, you also work in politics. That software is politics.

This image is from a book called ‘The New Anatomy of Britain’ by the journalist Anthony Sampson.

He published a book approximately every 10 years from the 1960s onwards on the subject of power in the UK. Each book included a diagram — this is the one from 1962.

The size of the bubble denotes how much power he thought that group had. In this one, the ‘Aristocracy’ is still significant enough for a mention.

This is the one from the early 80s. The Civil Service looms large (this was the age of ‘Yes Minister’). Trade unions and nationalised industries also feature.

In 2004, the centre of government, in the form of the prime minister, is shown as a lot more powerful. As is what we’d now call ‘traditional media’ (Fleet Street / TV). Despite being the age of ‘Web 2.0’, digital doesn’t get singled out.

I think these diagrams are useful because they demonstrate a couple of things really clearly.

Firstly, they reduce things down to an abstract landscape of power. There is only power — be it commercial or political, accountable or unaccountable.

Secondly, they show that power is mutable: it changes over time, it can be changed, and that change can happen quite rapidly too (as the changing power of unions illustrates).

But, fundamentally, they show this: that politics is about the distribution of power in society.

In the second decade of the 21st century, digital services — code and design — are changing how power is distributed.

To illustrate how, I’m going to give a couple of quick examples from different sectors:

Society’s ability to regulate industries effectively is limited by its ability to access and understand code, as we saw with the VW emissions scandal.

Research from Carnegie Mellon University found that men were far more likely to see Google adverts for high paying executive positions than women.

The decisions developers make about how they model adverts on job publishing platforms have an effect on people’s ability to find work.

Debates about workers’ rights are increasingly debates about code. The employee tracking and conditions that are coded into the platforms of Deliveroo and Uber are changing what it means to be employed.

The other day, there was the suggestion that the slump in the pound was the result of algorithms feeding on news stories from social media. (Possibly some of these could have been generated by bots. It’s bots all the way down!)

Facebook’s dilemmas around news algorithms are pretty well documented. The decisions they are faced with about censorship and quasi-regulation are the things that, historically, only nation states have had to deal with (but of course Facebook lacks the opportunities for recourse that nation states have in place — at least in the western tradition).

Connected devices are redefining what privacy — a fundamental human right — means. And we are, in turn, being asked to trust opaque machine learning systems with that data. This is a screen-grab of Google Now requesting access to a Nest thermostat.

Uber is winning the battle for the future of personal transport through a combination of legal brute force and amazing service design.
But, long-term, could we see them replacing democratically accountable civic transport networks?

And that is not the only area where the ability of our elected officials to make transport policy is being impaired.

Back in May, there was an election in London for a new Mayor.

One of the flagship policies of the Labour candidate was a ‘hopper ticket’ — bus users would be able to change buses as many times as they liked within an hour.

This is a big deal for people on low incomes on the outskirts of London, and a clear policy to try and address it.

The policy actually launched a couple of weeks ago, but in a slightly different form.

Just after the election, Transport for London released a statement saying that, in the short-term, they could only implement the policy for up to two bus rides.

Reading between the lines, it sounds like TfL couldn’t fully deliver on an electoral promise because they couldn’t change the code, and the candidate and the press couldn’t know this during the election.

I imagine that was an interesting first meeting.

Government policy is increasingly expressed in code like this.

So, if politics is about the distribution of power in society, software is now politics.

The decisions of designers and developers are a political force of their own. So this is what I mean when I say I think you work in politics.

And we are asking users to trust us with more data, to allow code to make more decisions for them all the time.

You are in this bubble!

Now, it’s possible to see all this in a negative light. Run for the hills and disconnect! Reality or nothing! But I think we have reached a really interesting point from a design point of view.

And it’s this:

If you want your users to trust you with more data, and make more decisions on their behalf.

If you want users to start trusting their data in your machine learning system.

Or if you want users to trust your device in their house.

This is the fundamental design decision facing people building digital services right now.

It’s time to stop designing digital services to be easy to use and to start designing them to be understandable, accountable and trusted and easy to use.

Or to put it another way: ‘It just works’ is not good enough anymore.

So, we need to figure out how to wire these attributes — understandable, accountable, trusted, into the services and institutions of the digital age.

It’s time for some new design patterns.

Incidentally, we’ve been here before: the manufacturing and industrial revolutions led to new institutions and practices that took the work of entrepreneurs and put it to work for society.

There are four areas I think deserve our attention:

Can we make accountability and transparency part of the design of services?

This is a hypothetical service for starting and managing a company. Now, obviously the service should be designed around user needs, and users should not have to understand the structure of government to use it.

But that is not the same as obfuscating how government works.

So, what if you could understand why that service is the way it is directly from the service? For example which government ministers are responsible for its safe running?

How the service is performing?

What is the underlying legislation for the service? Why does it exist and how can it be changed?

What definitive data sources does it use?

This is a government example, but what would a commercial equivalent be?

Maybe making it clear exactly how much was paid to an Uber driver, and whether it meets living wage levels, directly on the email receipt?

Or making it possible to understand supply chains and environmental impact when buying a product from Amazon?

Or displaying government food safety inspection data next to a delivery order from Deliveroo?

Could they do all this in the app, at the point of use?
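To make the Uber receipt idea concrete, here is a toy sketch of the calculation in Python. Every figure and field name here is made up for illustration; this is not how Uber structures its data.

```python
# Toy sketch: annotate a ride receipt with the driver's effective hourly pay
# versus a living wage. All figures and field names are hypothetical.
LIVING_WAGE_PER_HOUR = 9.75  # stand-in figure, not an official rate

def receipt_note(driver_payout: float, ride_minutes: float) -> str:
    """Return a plain-English note suitable for printing on a receipt."""
    hourly = driver_payout / (ride_minutes / 60)
    verdict = "meets" if hourly >= LIVING_WAGE_PER_HOUR else "falls below"
    return (f"The driver received £{driver_payout:.2f} "
            f"(£{hourly:.2f}/hour), which {verdict} the living wage.")

print(receipt_note(driver_payout=7.20, ride_minutes=25))
```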

In fact, Google has started showing some work in this direction with Google News explicitly marking fact-checked articles.

This is only going to get more important as superficially good design abstracts how things actually work further away from the user.

So maybe you should be able to ask Alexa “did the people who assembled you have the right to paid parental leave?”

As we saw with the bus-fare example earlier, more and more government and non-government services are expressed in code. The code is the definitive article, and changing policy requires an understanding of how the code works.

So, what mechanisms do we have to understand how the code works?

One obvious way is to examine the source code directly.

Luckily the UK government increasingly does open its code.

I was part of the team that drafted this a few years ago — it’s the Digital By Default Service Standard, and it includes a requirement to open up the source code of new projects.

Other governments have since made similar commitments.

Now, obviously not everyone can read code — although journalists and consumer rights organisations probably do need to get better at it — and there are many circumstances where organisations will not want to release their source code.

Maybe we can look at other parts of the software development tool-chain to help expose the rules.

If services published their tests, would it help people understand how they work? After all, Gherkin syntax is designed to be understandable by non-coders.

Here are a couple of examples that explain the rules about free prescriptions in the UK:
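(The scenarios on the slides aren’t reproduced here; the following is a simplified reconstruction in Gherkin, and the real eligibility rules have more conditions than this.)

```gherkin
Feature: Free prescriptions
  Prescription charges in England depend on age and exemptions.

  Scenario: Someone aged 60 or over pays nothing
    Given a patient who is 61 years old
    When they collect a prescription in England
    Then they pay nothing

  Scenario: A working-age adult with no exemption pays the charge
    Given a patient who is 35 years old
    And they hold no exemption certificate
    When they collect a prescription in England
    Then they pay the standard prescription charge
```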

Could the publishing of software tests help politicians make better promises?

As software agents of one sort or another (bots, digital assistants, news feed algorithms) start to make more and more decisions for us, could publishing software tests be useful for making bots more transparent?

You can’t ‘view source’ on Siri or Google Now.

So, you might get things like this.

A couple of years ago, I built a proof of concept bot called Habitat to test this idea — it used Gherkin tests as the user interface.

It worked surprisingly well, at least for exposing simple, deterministic decisions.

But tests are only one way of exposing how code works. We could also see the emergence of software deposit organisations: bodies holding private code that consumer rights organisations or government inspectorates have the right to audit. Some of this already happens in the gambling industry.

For bots using machine learning, the public publishing of training sets will probably also become important.

The next area is ‘permissions’.

Take a minute to think about permissions. You might think that this is already solved, but it is probably the most important research question in digital product design today.

We are now pretty used to apps asking for permission to use our cameras or access our location. But we are not yet used to the idea of different services exchanging data about us, beyond maybe an email address.

I’d like to show you an example of how that could work in a government context.

That is not how government works at the moment. And that is only a prototype.

But we are going to have to figure out design patterns for exchanging new types of data in a way people understand.
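As a sketch of what one such pattern might look like (an illustration only, not any real system’s API), a permission could be recorded as an explicit, scoped, expiring grant that can be replayed back to the user in plain language:

```python
from datetime import datetime, timedelta, timezone

def grant_permission(subject: str, requester: str, scope: str, days: int) -> dict:
    """Record a scoped, time-limited permission grant. Field names are hypothetical."""
    now = datetime.now(timezone.utc)
    return {
        "subject": subject,        # whose data is being shared
        "requester": requester,    # which service asked for it
        "scope": scope,            # exactly what they may read
        "granted_at": now.isoformat(),
        "expires_at": (now + timedelta(days=days)).isoformat(),
    }

# e.g. a parking-permit service asking to read a home address for 30 days
grant = grant_permission("citizen-123", "parking-permits", "read:home-address", days=30)
print(grant)
```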

The other reason this is a hard problem is that it is a moving target: more data and more devices all the time. We are already struggling today with just a few devices and data points.

I think this quote from Charlie Stross sums up the problem we face.

And I think there is a parallel here to how the tools we needed to organise the web had to change over time. Back in the late 90s, a hierarchical list of categories was a perfectly acceptable way to organise the web for people, but as it got bigger we needed better and better search; personalised search; machine learning assisted search.

When it comes to permission systems, we are currently the equivalent of the Yahoo homepage.

The argument here is not just transparency.

Services providing an access history to users should become an accepted standard, because putting users in control of their data is the best protection organisations have against fraud, and the best protection users have against misuse.

This is a mockup of a government transparency log, but something like it could equally apply to a commercial service.
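One way a log like this could be made tamper-evident (my assumption, not the design behind the mockup) is to have each entry commit to the hash of the previous one, so a service cannot quietly rewrite history:

```python
import hashlib
import json
import time

def append_access(log: list, who: str, what: str) -> None:
    """Append an access entry that commits to the hash of the previous entry."""
    prev = log[-1]["hash"] if log else "genesis"
    entry = {"who": who, "what": what, "when": time.time(), "prev": prev}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    log.append(entry)

# Hypothetical example: two agencies reading a citizen's records.
log = []
append_access(log, "dvla", "read: home address")
append_access(log, "hmrc", "read: income record")
```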

Again, I’m not sure we have a clue what design patterns will work here — especially as the amount of data expands.

There is no single answer, and this subject needs serious investment by governments and the tech sector.

I think Facebook have gone on record to say they want data sharing to become more of a design problem and less of a policy problem, which is encouraging.

Finally, for users to really trust stuff in the digital world, we need trusted 3rd parties to do some of the hard work for them.

And this means giving some elbow room to some new digital watchdogs.

We are all familiar with the idea of new technology resulting in new regulatory institutions.

It took this book in the US to change the law around car safety.

In the UK, The Consumers Association was set up to test the products of the manufacturing revolution.

What will the watchdogs of the digital age look like?

Can the tools that we use to develop software become the tools of consumer watchdogs?

Environmental campaign groups might start automatically checking open government data for breaches of regulations.

We could see third parties actively verifying datasets and checking facts.

The approach GDS is taking to open registers is intended to enable this: they use signed Merkle trees so others can verify the integrity of a government register.

So a legal watchdog could independently and automatically verify the integrity of the Land Registry.
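Here is a minimal sketch of that check in Python. It is not GDS’s actual implementation (registers define their own serialisation and signing), but the shape is the same: recompute the Merkle root from the published entries and compare it against the root the operator signed.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(entries: list) -> bytes:
    """Hash each entry, then fold pairs of hashes up to a single root."""
    level = [sha256(e) for e in entries]
    while len(level) > 1:
        if len(level) % 2:  # odd number of nodes: carry the last one up
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# A watchdog fetches the register's entries plus its signed root, then checks
# that the root it derives itself matches the one the operator published.
entries = [b"record-1", b"record-2", b"record-3"]
published_root = merkle_root(entries)  # stand-in for the fetched, signed root
assert merkle_root(entries) == published_root, "register integrity check failed"
```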

Or an app on your phone might verify the food safety rating of the takeaway you just walked in to.

Data validators become advocacy tools when they are checking for compliance with data standards that have an effect on equality in society.

This is a validator for the jobs advert standard that the government has adopted. Try it out on your company.
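The pattern itself is straightforward. A sketch in Python, using made-up field names rather than the actual standard:

```python
# Minimal sketch of a compliance validator for job adverts.
# The required fields here are hypothetical, not the real standard.
REQUIRED_FIELDS = {"title", "salary_min", "salary_max", "location", "employer"}

def validate_advert(advert: dict) -> list:
    """Return a list of compliance problems; an empty list means it passes."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - advert.keys()]
    if {"salary_min", "salary_max"} <= advert.keys():
        if advert["salary_min"] > advert["salary_max"]:
            problems.append("salary_min is greater than salary_max")
    return problems

print(validate_advert({"title": "Engineer", "employer": "Example Ltd"}))
```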

Single-purpose monitoring services like twofactorauth.org and OpenDiversityData seem to be another emerging pattern.

The work going on around software supply chains is also really interesting.

Take the effort towards reproducible builds of Debian, for example: being able to say exactly what set of code is running on a device.
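The verification step itself is simple; what is hard is making builds deterministic in the first place. A sketch of the check, with placeholder bytes standing in for a real artifact:

```python
import hashlib

def digest(artifact: bytes) -> str:
    """Fingerprint a build artifact so independent builds can be compared."""
    return hashlib.sha256(artifact).hexdigest()

# If a build is reproducible, two parties compiling the same source end up
# with byte-identical artifacts; publishing the digest lets anyone confirm
# that what is running on a device is what the source says it should be.
artifact = b"\x7fELF...bytes of the built binary..."  # placeholder bytes
print(digest(artifact))
```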

Could the code scanning tools currently used by companies to identify open source licence restrictions in their codebases be repurposed to identify vulnerabilities?

This is going to be increasingly important in safety-critical systems: you want to know that the system operating the brakes on your car is the one it is supposed to be. There is an excellent talk from FOSDEM on safety-critical FOSS by Jeremiah Foster; I recommend looking it up.

We need to ask ourselves this question. In a world of Amazon, Facebook and Uber — do we need a global consumer rights org? Who is going to explain all this to users?

So, these are some important areas we need some new design patterns for.

And if you work in the digital industry, you are in this bubble.

Like it or not, you work in politics.

I could go on about absolute power corrupting and all that. And the moral angle is very important.

But questions of accountability, understanding and trust are only going to become louder.

These issues are only going to get harder to solve as we ask users for more data and to trust code to make decisions for them.

The organisations that understand this and start thinking about how to make services that are accountable, understandable and trusted will have the advantage.

Originally published at blog.memespring.co.uk on November 23, 2016.
