The opportunity in collective action
Since 1939, Citizens Advice, with the support of over 21,000 volunteers, has helped some of the country’s most vulnerable citizens with issues around housing, debt and employment. In one year alone, they provided advice to over 2.7 million people and last year had 43 million hits on their website.
Their policy and research team therefore holds deep insights into consumer needs and behaviours, and is well placed to spot emerging problems. One area of growing concern for Citizens Advice was personalised pricing. So towards the end of 2017, we ran an exploration workshop with them to learn about personalised pricing and why it is a priority issue for them.
What is personalised pricing?
Personalised pricing is when consumers are charged different prices for the same thing based on their ‘price sensitivity’. The practice of changing the price of the same product on the basis of someone’s willingness to pay is also sometimes described as ‘price discrimination’.
There is also a distinction between dynamic pricing and personalised pricing.
Dynamic pricing follows supply and demand — companies adjust prices as interest in a product or service rises and falls, as you see when searching for flights. Surge pricing, like Uber’s, is also dynamic pricing.
Personalised pricing is much murkier — it is about what you, specifically, are willing to pay. Companies tailor the price not just by segmenting customer groups, but by using data from a range of sources to target the price they charge you to be precisely what you are willing to pay.
Why should we be concerned?
The way people engage with markets is changing. Over the last decade there has been a 45% increase in the number of people who shop online — people are now more likely to shop online than to use the internet for social media or for reading the news. It’s one of the most popular ways people engage with online services.
Citizens Advice’s research has found that there is essentially a loyalty penalty in consumer markets. Companies take advantage of people’s loyalty, and consumers in vulnerable positions are more likely to be loyal. They found, for example, that only 12% of energy customers in the lowest income brackets are on the cheapest tariff and 74% have never switched. In contrast, 70% of the highest earners are on the cheapest deals and only 29% have never switched.
Personalised pricing is similarly likely to discriminate unreasonably against the most vulnerable consumers and hit them the hardest.
We are already seeing big differences in outcomes, and big differences in fairness, depending on how savvy and engaged consumers are in the markets where they’re purchasing goods and services. Companies have the data, increasingly sophisticated pricing strategies and a knowledge of our biases and our inertia — and they have the opportunity to take advantage of all of that.
What’s the scale of this?
We know that the cost to consumers is £23 billion (a figure from Citizens Advice research). But this only represents self-identified cases — where people know they have been ripped off. Price discrimination is also hidden: our research into the nation’s digital attitudes and understanding found that almost half (47%) of the population have not noticed that prices vary when they search for them online. The true cost is therefore likely to be a lot higher.
The consumer experience
During our workshop we used mapping as a way to understand people’s experiences. We focussed on the essential markets (telecoms, water, post, energy). This is an important distinction because a lot of current research and awareness of personalised pricing has focussed on luxury goods like flights and Uber.
Through the mapping we were able to identify the vulnerable moments for consumers when they are shopping or researching online — from search engines and price comparison websites, through to individual providers who don’t make introductory offers available to existing customers. There are many other types of price discrimination too, such as banks offering specific products — which might not be at current rates — based on your previous purchases and behaviour.
The costs are high, but unfortunately for consumers, policy and legislation haven’t kept up, and catching up isn’t simple. These decisions are made by algorithms — prices are set inside a black box, which makes it incredibly hard to work out why they are changing and what factors are being taken into account. Much of this is not even understood by the companies themselves.
Ultimately Citizens Advice want to publicise the issue, to give providers and consumer watchdogs a sense that these risks are emerging and that accountability is absent. For these reasons, the questions we focussed on in the workshops ranged from how to raise this as an issue with consumers and policy makers, and how to explore where price personalisation is taking place, through to more practical questions of what tools and information could help consumers navigate it and whether there was a way to shine a light on how companies choose their prices.
Tools for collective action
“Since people don’t have direct knowledge of other people’s experience using products and services, they often can’t tell whether they’ve been subject to algorithmic bias.” — DeepMind Ethics & Society article.
During that initial exploration workshop we came up with the idea of a “Price Collective” — a way for the Citizens Advice network to participate in “citizen social science” — investigating and shining a light on the different prices people were being given for the same products and services. This felt like an interesting way to start exposing discrimination in the system that didn’t rely on individuals needing to take responsibility. Whilst explainability of automated decisions is an important area of work, especially for consumer bodies to be able to hold companies to account, it likely goes far beyond the realm that individual consumers can engage in, especially vulnerable ones.
To collectivise Citizens Advice users to expose where price discrimination is happening, we created a Chrome browser extension, which consumers discover through the Citizens Advice website.
If a consumer chooses to install it, they are asked for some basic information, and the extension then has permission to extract price data from certain sites the consumer visits and share it with Citizens Advice.
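To make the idea concrete, here is a minimal sketch of the kind of data the extension might capture. This is illustrative only — a real Chrome extension would be written in JavaScript, and the field names, sites and price format here are hypothetical assumptions, not the prototype’s actual schema.

```python
import re

def parse_price(text: str) -> float:
    """Normalise a scraped price string (e.g. '£1,234.56') to a number
    by stripping currency symbols and thousands separators."""
    cleaned = re.sub(r"[^\d.]", "", text)
    return float(cleaned)

# Hypothetical record the extension might share with Citizens Advice:
# the price seen on the page, plus the basic (opted-in) context needed
# to compare prices across participants.
record = {
    "site": "example-energy.co.uk",        # placeholder domain
    "product": "standard-tariff",          # hypothetical product id
    "price": parse_price("£1,042.50"),
    "region": "SW",                         # basic info the user chose to share
}
```

The key design point is that only the price and minimal context leave the browser — enough to compare what different people are shown, without profiling the participants themselves.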
We created a light-weight prototype of the service that Citizens Advice could run, that collects prices from across the network, empowering the collective to investigate and reveal where price discrimination is happening.
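The core of such a service is simple aggregation: group the prices participants report for the same product and look at the spread. The sketch below shows one plausible way to do this — the data shapes and threshold-free summary are assumptions for illustration, not the prototype’s actual implementation.

```python
from collections import defaultdict

def price_spread(reports):
    """Group submitted prices by product and summarise the range seen
    across participants — a large spread for the same product hints
    at personalised pricing worth investigating."""
    by_product = defaultdict(list)
    for r in reports:
        by_product[r["product"]].append(r["price"])
    return {
        product: {
            "min": min(prices),
            "max": max(prices),
            "spread": max(prices) - min(prices),
            "n": len(prices),
        }
        for product, prices in by_product.items()
    }

# Example: two participants report different prices for the same tariff.
reports = [
    {"product": "standard-tariff", "price": 1042.50},
    {"product": "standard-tariff", "price": 1180.00},
    {"product": "broadband-basic", "price": 24.99},
]
summary = price_spread(reports)
```

Even this basic observation of inputs and outputs — without opening the black box — is enough to surface products where identical offerings carry very different prices.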
This could then be used to publicly expose where price discrimination is happening and be a first step in holding companies more to account. In an ideal scenario, this investigation and mechanism for exposing discrimination could become more sophisticated. With feedback loops back to participants that help build trust, people may be motivated to share more data and participate in more in-depth ways. Through blog posts, news and social media, Citizens Advice could update users on their contribution to the investigation and share any ways in which their participation had helped influence change.
In turn, through taking part in the investigation, concerned citizens may also start to group together to investigate issues further on their own. This could lead to the formation of online price watch communities and see more communities organising collective action efforts on their own.
This was an initial experiment to see how a collective response could work in terms of algorithmic accountability, and something we are exploring further with Resolver, with whom we are testing how viable it is as a functioning service and doing more user research to understand people’s motivations for participating.
We want to see automated decisions be explained, justified and audited; however, we also want to design ways in which individuals don’t feel a burden to understand if they don’t wish to, and to look for the opportunities where a collective response can be effective in investigating, exposing and evidencing discrimination. As decisions made by government and the corporate sector become increasingly automated, we need new ways to build strength in people power, and new ways for the public to scrutinise power, together. And as Omidyar says in this great report, simple approaches and “exploratory scrutiny” can be an effective starting point in revealing where discrimination might be happening —
“Whilst some more technically sophisticated types of scrutiny, especially in the realm of “black box testing”, are beginning to bear fruit, scrutiny doesn’t have to be sophisticated to be successful. Many of the most notable case studies we identified involved investigative reporting and basic observation of a system’s purpose, policies, inputs and outputs. Such approaches have led to productive public attention.”
Collective action is a strand of work we’re committing more time to over the coming months at Doteveryone, discovering other opportunities and contexts where collective action can play a role in scrutiny, accountability and influencing change. As part of this work we are also looking to civil society organisations to take a role in empowering the public and their audiences to take collective action in directing the impacts of technology on our lives. If you want to find out more or have ideas of where we can take this work, please get in touch — firstname.lastname@example.org
Thank you to Katherine Vaughn, Morgan Wild and Marini Thorne at Citizens Advice for their insights and expertise on price discrimination, and especially to Katherine for making sure this work took shape in the first place.
Also, thank you to Sarah Drummond at Snook and Dan Williams for bringing their expertise to the workshop sessions.
— — — — — — — — —
- The Atlantic, 2017, How online pricing makes suckers of us all
- Harvard Business Review, 2014, Data could lead to discrimination
- Bruegel, 2017, Big data and first degree price discrimination
Car insurance case study:
- Money Box, 2017, The cost of driving while divorced
- Time, 2015, Why Your Auto Insurance Rate Could Go Up If Your Spouse Dies