Being private in a digital society

Ben Stewart
Caution Your Blast Ltd
11 min read · Mar 22, 2019

In light of the systemic practice of personal data being traded for software services, we ask whether purposeful design has driven the soft abuse of human privacy rights to date. What state have we got ourselves into, and what is needed to fix things? Can personal privacy be safeguarded and built into technology services and viable businesses? Finally, we share our insights and experience from building and testing a public transport information platform that neither stores nor shares personal private data.

By design or by accident?

The European Union's General Data Protection Regulation (GDPR), which took effect in 2018, clearly reflects the basic and fundamental human right to personal privacy. Yet almost universally among Western businesses it landed as a shocking development and an affront to be dealt with. The many seminars and business workshops I attended on the subject were characterised by denial of the rationale behind the need for change, and denial of any wider obligations towards customers in the digital age.

In the lead-up to the GDPR taking effect, UK companies remained in denial, and even pro-EU 'remain' businesses expressed hopes that Brexit would relieve them of their new obligations. Now that the regulation is in force, non-European companies that do not wish to recognise their obligations simply refuse to do business with citizens of the EU. We have even seen new services launched solely to help these companies block EU users from their web services and products.

Possibly the average Western business has simply been too lazy, or too out of touch with the wider impact of digital innovation on its customers and others, to notice basic human rights being systematically undermined. Clearly, building in privacy for users has been at best an afterthought for many providers of digital platforms, and consequently for the many businesses that rely on them.

Of course, there are non-digital historical precedents. Rather than paying the large cost of a mail-drop to every letterbox, a business could make a significant return on investment by profiling and targeting specific households. This required knowledge of the individual residents, and hence gave rise to marketing brokers who could connect businesses to buyers in specific customer segments using their demographics.

Digital technology has simply amplified this model; the underlying business has remained the same. Brokers seek information on individuals from any source they can get, monitoring many channels and purchasing intelligence as they see fit to refine their knowledge of individuals.

Many digital product innovations and new services that in themselves have no sustainable business model have taken root in everyday society simply because they can be funded by on-selling users' personal data to these brokers.

On digital platforms it is not simply that users are forced to be more public with their private information, but rather that they are expected to share their personal details with the service provider. This is the new currency on which an entire service can depend.

Such digital platforms leverage knowledge of an individual for their own financial benefit, above and beyond using this data to provide a better product or service to the user. The latter is the universal cloak phrase presented to users to shift their focus away from the loss of important personal rights. These companies usually offer a quality product or service, constituting what has to date commonly been interpreted as a "good deal" in return for access to personal information.

Ultimately there is nothing accidental about where we find ourselves with regard to the erosion of fundamental privacy rights in our society: it has been driven by business and permitted by governments. The business model is proven and highly scalable; consequently it has built some of the most valuable corporations in the history of the world within the first decades of the 21st century.

Recognising the state we’re in

Now, thanks to the likes of Edward Snowden, the GDPR, and groundbreaking investigative journalism from newspapers like The Guardian, some of the many abuses taking place with people's private information have been exposed. The public conversation is maturing, and many consumers are becoming rightly concerned.

Since 2012 our company, Caution Your Blast Ltd, has taken the view that this erosion of human privacy rights is hugely damaging for society, and we have tried to make time to change things. Our concern grew from the fact that communication underpins what it is to be human: to experience life, learn, share, work, and be entertained, we must communicate. And that personal communication must span a continuum from totally private to fully public in order to serve our unique communication needs and our unique identity. In other words, an individual's choice not to communicate at all, or to restrict with whom they communicate, is part of what makes a person unique and an important part of who that individual "is".

But "opting out" of communication platforms isn't an option either. Once family, friends, or peers start using a tool to communicate, it becomes essential to use it too. The internet connects people directly for communication, and indirectly through the world of information; the very act of using it further shapes and defines how individuals make sense of themselves, changing their preferences and behaviours.

Even with these most basic uses of the internet, personal activity is now monitored by infrastructure providers, who make money by on-selling the personal data they handle: Internet Protocol (IP) identifiers, along with the unencrypted messages linked to them (searches for information, messages, and files uploaded and downloaded). All of this identifies and categorises a household, if not an actual individual within it, providing the raw elements for making good money as a broker of personal data.

This activity ramps up significantly with product and service providers: social media networks, mapping services, game makers, et al. Almost universally, the free versions monitor personal activity and store its history. For many, this is the only commercial gain possible; no other business model stacks up, as, perhaps unsurprisingly, revenue streams turn out to be difficult to find in the hyper-competitive marketplaces of the internet and mobile apps.

Not only do these providers have access to the private one-to-one communications of their users, they seek to go further: to know what a person keeps to themselves. To know what is fully private, the polar opposite of being public. To know what we think and feel.

These are high stakes that strike at the very heart of human rights and identity, particularly as we head into an increasingly technology-rich future across the globe.

Making things better

So how can we take steps towards fixing this whilst retaining the capability of digital technologies to improve lives and increase democratic and fair access to services?

It is easy to trace how national communications systems became largely trustworthy in democratic societies by the end of the 20th century through iterations of regulation and legal action. But in the 21st century, international borderless systems based on cheap digital technology have changed this landscape and now new providers have established vast global communications networks with little regulation or legal precedent for how they should operate. In this context the EU’s exemplary lead with GDPR is welcome and a timely nudge that we have the power and opportunity to redress the status quo.

However, providing full privacy relies on the service provider going to extreme lengths to make certain that their user is anonymous: i) on the internet, ii) to their third-party partner service providers, and iii) to their internal staff. All of this is very difficult and expensive. And why go to so much trouble and cost when you can apply the "just-between-us" model of privacy? That is, simply saying, as a business, that you wish to commercially leverage a user's information but it will be "just between us". People are used to the idea of trusting one other person, after all, and it turns out that a strong commercial brand can build a similar trust.

This trust translates into product adoption much more easily if the user can be given something for free. This is the basis of the West Coast freemium business model made famous by Silicon Valley startups and scale-ups, and it relies on the inherent value of a user's personal information to make the model viable. Increasingly, across the globe, it is hard to compete with the many companies that give their products and services away for free, including many with very high-quality experiences. But free in terms of money, not in terms of privacy.

The success of these services has shown that most people are prepared to allow a commercial organisation to be a "co-owner" of their private information, knowing their most intimate details. Possibly this is a short-term aberration in social growth: a classic human mistake of reacting slowly to the specific challenges of a new environment, a blindness born of overload. But of course some people will always be the trusting type; it is a personal characteristic too, meaning there will always be the potential to exploit these people regardless of wider societal awareness.

We know that total privacy is an important need for many people, and that providing this choice is paramount in all "good" services and products. But is this realistic? Technology itself, of course, is not the problem; it is what we choose to design that becomes an issue or creates an opportunity. And in the case of privacy it is clear that this design need is being compromised.

Taking a closer look at design, we know that getting from concept to market involves at least two logical phases: a first phase of innovation, where concepts and ideas are proved viable, and a second, entrepreneurial phase to achieve user and marketplace adoption.

Privacy is routinely compromised by design in this second, entrepreneurial phase, which also requires finalising a business model. Here it is becoming harder and harder for organisations offering software-based services to fund themselves by charging the customer. Regardless of how much of a breakthrough a new product or service is, the average person shuns paying for something unless the value is either tangible, in the form of a physical product, or exceptionally unique and extraordinary. People won't pay for Google Search, an amazing technology that indexes the entire web and returns results in a fraction of a second, or for Shazam, a breakthrough technology that can identify the music it hears. They will, however, pay for Spotify, a service that provides on-demand access to a large proportion of the world's high-quality recorded music. The line between what can and cannot successfully be charged for is exceptionally difficult to define.

Entrepreneurs cannot afford to get this decision wrong, as their go-to-market model will not be easy to change after launch. But that is fine, because the gamble of charging a fee and getting it wrong can be avoided entirely by giving things away for free. This leaves the entrepreneur with one problem: how to pay for the service? And this is where selling users' personal data easily slips in as the entrepreneur's friend: easy to implement, with a ready market of buyers, it is easy money.

Recognising this causal driver makes it clear that intervention is required if we are to design socially beneficial, transformative products and services. It is specifically this go-to-market phase that requires a step change in maturity, possibly with regulation and close monitoring of business models and customer-related activities. Potentially, all digital services and apps should be required to publish what part the end user plays in their overall business model.

That said, it is equally important not to regulate early innovation, the first design phase. Innovation serves society best when it is largely unrestricted, with minimal regulation, allowing people to explore the edges of our world, both real and created.

Clearly the market will not regulate itself. By shaping and guiding the business models of digital services in the go-to-market phase, we will increasingly see innovations that put personal data privacy at the core of their idea. I'm inspired by initiatives such as https://datproject.org/, enabling the decentralised federated web, similarly https://dadi.cloud, and also https://puri.sm, founded with a mission to provide people with computers and phones that protect their privacy. For each of these few exemplary leaders who care about good business there are literally thousands who are simply after easy money, some of them now financially bigger than entire nations.

Postscript: Privacy in practice

The data privacy issue became a focus for us while working with a partner organisation, Ayoupa Ltd, on the provision of real-time public transport information.

We identified a user need for fully private ways to plan travel on public transport in the UK. At the time, users could get itineraries, directions, and real-time vehicle positions, all of which are brilliant, but only if they gave up their personal data in exchange. In particular, providers of this information had built their business models and their technology in such a way that full data privacy could not be offered, let alone guaranteed.

We set about creating an alternative to these models in order to build in data privacy by design. Achieving this socially responsible service while remaining competitive on performance led us to a design decision: decentralise the route-planning service, giving users the capability to answer their own route queries (options for how to get from A to B) with real-time timings and disruptions.

To implement this design we used the end user's mobile device to provide the processing power to resolve complex routing queries (a simplified sketch of the idea follows the list below). With this approach:

  • The user’s personal data is processed on their own device and not shared with any of our network systems or with any transport information provider.
  • The central server costs of processing routing queries are eliminated entirely.
  • The search query can be fulfilled even while offline, with times gracefully degrading to timetable information.
  • Entirely new, unique features and functionality are now possible such as continuous monitoring of public transport route options whilst moving.
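To make the idea concrete, here is a minimal sketch of what on-device journey planning can look like. It is illustrative only, not our production routing engine: it assumes a simplified Connection Scan style algorithm over a timetable bundle stored on the phone, and the `Connection` and `JourneyPlanner` types are invented for this example. Real-time delays, when a feed is reachable, are merged locally; offline, the same query simply falls back to timetable times.

```swift
import Foundation

// One scheduled vehicle movement between two stops, taken from a
// timetable bundle stored on the device (hypothetical type).
struct Connection {
    let fromStop: String
    let toStop: String
    let departure: TimeInterval // seconds since midnight
    let arrival: TimeInterval
}

// A tiny on-device journey planner. Both the data and the query stay
// on the phone, so nothing personal ever reaches a server.
struct JourneyPlanner {
    let connections: [Connection]

    // Real-time delays keyed by "fromStop->toStop", in seconds. When no
    // feed is reachable (e.g. offline), this stays empty and results
    // gracefully degrade to pure timetable times.
    var delays: [String: TimeInterval] = [:]

    // Earliest arrival at `destination` when leaving `origin` at or
    // after `departureTime`; nil if unreachable.
    func earliestArrival(from origin: String,
                         to destination: String,
                         departureTime: TimeInterval) -> TimeInterval? {
        // Apply known delays, then scan connections in departure order:
        // the core of the Connection Scan Algorithm, simplified here by
        // ignoring minimum transfer times.
        let live = connections.map { c -> Connection in
            let d = delays["\(c.fromStop)->\(c.toStop)"] ?? 0
            return Connection(fromStop: c.fromStop, toStop: c.toStop,
                              departure: c.departure + d, arrival: c.arrival + d)
        }.sorted { $0.departure < $1.departure }

        var earliest: [String: TimeInterval] = [origin: departureTime]
        for c in live {
            // Board only if the stop is reached in time, and keep the
            // connection only if it improves the arrival time.
            if let reached = earliest[c.fromStop], reached <= c.departure,
               c.arrival < earliest[c.toStop, default: .infinity] {
                earliest[c.toStop] = c.arrival
            }
        }
        return earliest[destination]
    }
}

// Usage: the query and its answer never leave the device.
var planner = JourneyPlanner(connections: [
    Connection(fromStop: "A", toStop: "B", departure: 32400, arrival: 33000),
    Connection(fromStop: "B", toStop: "C", departure: 33300, arrival: 33900),
])
planner.delays["A->B"] = 120 // merged from a real-time feed when online
if let arrival = planner.earliestArrival(from: "A", to: "C", departureTime: 32400) {
    print("Earliest arrival: \(Int(arrival)) seconds past midnight")
}
```

Continuous monitoring while moving then amounts to re-running the same local query with a new origin and the current time; no server round-trip is needed.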

The approach was successful and we now have the technology to guarantee personal data privacy to travellers as they make queries and monitor progress on public transport journeys.

Our business model options are many and desirable, as they all extend from being a trusted provider. It turns out that full personal data privacy can be built in by design very successfully.

We are also excited by how neatly this approach suits the vision of Mobility as a Service (MaaS), where the orchestration of transport happens around the user's needs, rather than today's model in which users organise themselves around transport providers. The promise of MaaS is also an opportunity to rethink how the inherent human right of full data privacy can be built in by design.

The fact that we — a small privately owned business with under ten staff — could bring this model to market demonstrates that fundamental human rights such as data privacy by default can be core to services of the future that are underpinned by technology.

You can download the product, Commuter for the Apple iPhone, on the App Store here. [Update since publishing: it has been brought to my attention that the app has not been maintained and as a result the routing engine is not functioning. Please contact me if you are interested in the design and the capability.]


Ben Stewart
Caution Your Blast Ltd

Design for human activity over mechanistic process, working for a sustainable future, founder: www.cautionyourblast.com