Fair Trade Data

A future where an ethical use of personal data is labeled

For my Global Design Future forecast I chose the topic “The digitalization of our physical world and the future of privacy.” I asked myself: is a world of total surveillance inevitable, or are there possibilities to design a delightful user experience?

Global Trend: The Always-Connected Consumer

I started my forecasting with desk research and a STEEP map. I identified trends in different areas like Smart Cities, Autonomous Driving, Big Data, Quantified Self, Wearables and the Internet of Things. The main trend that emerged after mapping all of these topics was that the internet is expanding into our physical world.

STEEP Map: Initial research to identify global design trends.

We will always be connected. By 2020 there will be 50 billion devices connected to the internet, Cisco IBSG predicts (Evans, 2011). A time will come when “every device is connected and we don’t even know it”, says Mikko Hyppönen from F-Secure (Palmer, 2017).

In his book “The Inevitable”, Kevin Kelly (2016), founding executive editor of Wired magazine, shares his thoughts on the technological forces that will shape the future. He expects that the internet will always be present, and will feel more like a conversation and an experience than a “place” (Kelly, 2016).

The digitalization of our physical world and the future of privacy

The internet is currently expanding into our physical world, which will have an immense impact on how we live and behave in the future. Related to the digitalization of our physical world (IoT), Kevin Kelly (2016) said that “everything that can be tracked, will be tracked”.

The collection and use of private data will be almost invisible and unavoidable. When we add sensors to ourselves and to the objects and places around us, we will create a virtual blueprint of our physical world. This trend will also challenge the core values of our society and raises questions about privacy.

Google’s self-driving cars gather nearly 1GB of sensor data every second. What if we apply the technology of autonomous cars to our environments?

In March this year, WikiLeaks (2017) published a collection of documents detailing CIA hacking programs. A program called “Weeping Angel” allows the agency to hack into Samsung smart TVs and use them as covert microphones, even when they appear to be “turned off” (WikiLeaks, 2017).

Another example where users’ privacy is violated is the IoT device My Friend Cayla for kids: “Researchers discovered that My Friend Cayla is pre-programmed with dozens of phrases that reference Disneyworld and Disney movies. […] This product placement is not disclosed and is difficult for young children to recognize as advertising. Studies show that children have a significantly harder time identifying advertising when it’s not clearly distinguished from programming.” (Rotenberg, Gartland, 2016, cited in Rosner 2017).

With the launch of Google Home (a voice-activated speaker powered by the Google Assistant) in the UK and Amazon Echo (the voice service from Amazon), we are now able to connect our most private place, our home, with all the leading tech companies.

Google Home waits for the signal “OK Google” to become fully active. Everything you say before that signal is called an ambient conversation and, according to Google, is neither stored nor sent over a network. There is also a physical mute button on the back of the device, which cuts off the mic. But can you trust that this is a real off, or is it only a Samsung-style “Fake-Off” for users who believe in the good?

When it comes to the digitalization of our physical world, what kind of indicators does a user need to build trust towards a device, brand or environment?

According to Ryan Calo (2013) of the University of Washington, the problem is that firms write their own rules, with their own goals in mind. “Our things will contain tiny salesmen with a fortitude and patience born of intense corporate resources and economic incentives.” (Calo 2013, cited in Rosner 2017).

These examples lead to three weak signals:

Weak Signal #1 
Everything that can be tracked, will be tracked.

As Kevin Kelly (2016) said, everything that can be tracked, will be tracked. A world of total surveillance therefore seems inevitable.

Weak Signal #2
Every device will be “smart” and we don’t even know it.

Every device will be connected. As Mikko Hyppönen from F-Secure said: “It’s going to be so cheap that vendors will put a chip in any device, even if the benefits are only very small. But those benefits won’t be benefits to you, the consumer, they’ll be benefits for the manufacturers because they want to collect analytics.” (Palmer, 2017).

Weak Signal #3 
Emergence of personalized experiences.

The core aim of environments becoming “smart” is to provide personalized interactions, but at the same time the potential for misuse of information is massive (Nixon et al., 2005).

With constant tracking we will create a blueprint of our lives. The produced data can be used for profiling, which “can enable aspects of an individual’s personality or behaviour, interests and habits to be determined, analyzed and predicted” (ICO, 2017).

Will it become the new normal for us, as always-connected consumers, to live with surveillance? What about our society’s core values and the right to privacy? Is that right still fit for our time, or do we have to change our values, beliefs, attitudes and behaviors?

The digitalization of our physical world will undoubtedly be useful, but there are potential downsides that we must guard against.

The Design Experiment

The insights from the desk research led to the next stage of the project. With the help of the following design experiments, I was able to generate user insights about the future of privacy in terms of IoT and data collection. To gain these insights, I created two different provocations.

Two Provocations

  • Provocation #1: What if privacy becomes a luxury good?
  • Provocation #2: What if companies have to accept my Terms & Conditions?

The first scenario takes place in a world of total surveillance, where every device is connected to the internet and able to collect personal user data. The second scenario is a more utopian future where an ethical use of personal data is labeled.

Structure of the Experiment

  • Stage 1: Provocation #1
  • Stage 2: Provocation #2
  • Stage 3: Testing the Future Scenario
  • Stage 4: Final Outcome

The idea of the experiment was to create different scenarios to be able to build on the findings and insights which were generated during each stage.

Stage 1: What if privacy becomes a luxury good?

The first provocation started with the question “What if privacy becomes a luxury good?”, which arose from the key findings of the desk research. In this future scenario I wanted to create a world where technology is so cheap that companies put a chip in any device because they want to gain data for analytics. The main benefits will not necessarily be for the user but for the companies.

To bring this scenario to life, I created a packaging design for two IoT devices. The first device is a fully connected smart light bulb with Wi-Fi, voice control, a 360° camera and smart sensors. The second is an analog light bulb with no internet connection and no tracking functions.

Two prototypes to bring the provocation to life and turn the abstract topic into something more tangible.

Both products could exist today, but to bring them into the future I put a price tag on both: the fully connected light bulb gets a tag of £2.99 and the analog one £179.95 (the current price of the Nest Learning Thermostat). I used the price of the product to provoke, turning the supposedly cheap product into a “luxury good” of the future.

Starting the conversation about privacy and data collection with two prototypes of IoT products.

The aim of this experiment was to gain insights about how people react to this kind of future and what their assumptions are. Is it obvious why the connected device is cheaper than the analog one? Are users aware of why companies want to collect their data and what they use it for? What do users think about data collection in the physical space in relation to privacy issues?

I started the discussion with an introduction to the products and their functions. I explained both devices and asked the participants which product they would choose. Most answers came very quickly and with a clear statement:

“I would buy the cheap one and turn everything off […] because of privacy issues. I would turn it off unless I want to stalk my kids or my house.” — Woman, 25

Most of the users at first thought only about the product and the benefits they would get from it: they compared both products by their functions and chose the one where they get more for their money.

Questions during the interview about Terms & Conditions and use of data.

After a discussion about why they think the price is so different, issues about surveillance, privacy and the imbalance between user and company benefits of data collection arose. Most users know that companies collect their private data, but they have no idea what they use it for. They also want to turn the device off when they do not use it, to protect their privacy. In the end, this experiment led to the following insights:

Main Insights

  • Users only see the object (light bulb) and its obvious actions (light, voice, camera). They do not understand the context and invisible background actions (data collection and analysis).
  • Users want the benefits from technology but only when they need them. They do not trust the companies in terms of surveillance and privacy.
  • Users feel uncomfortable when they are always connected (with the companies).
  • Users do not know how companies handle their private data.
  • Users want to be in control of when private data is collected.

Stage 2: What if companies have to accept my Terms & Conditions?

The findings and insights of the first stage led to the creation of the second provocation. Insights about the imbalance between user and company benefits, and the missing trust in how companies handle private user data, led to the question “What if companies have to accept my Terms & Conditions?”.

With the current model of “Terms & Conditions”, every company can set up its own individual rules to achieve its goals.

The current model of Terms & Conditions.

To bring the second provocation to life, I created the brand Fair Trade Data. In this scenario, companies have to accept the Terms & Conditions of the users. Users can sign up to Fair Trade Data and set up their individual policies. Companies who want to join the Fair Trade Data community have to accept agreed standards and the policies of the user.

Future Scenario: The Fair Trade Data model.
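The inverted model can be sketched as a tiny data contract. This is only an illustration under the assumption that each privacy area reduces to one boolean rule; the names (`Policy`, `company_may_join`) are hypothetical, not part of the service:

```python
from dataclasses import dataclass

# Five privacy areas, encoded as one rule each (hypothetical encoding).
AREAS = ["collection", "usage", "analysis", "storage_security", "sharing"]

@dataclass
class Policy:
    """One boolean per area: True means the practice is permitted (for a
    user's policy) or performed (for a company's practices)."""
    collection: bool
    usage: bool
    analysis: bool
    storage_security: bool
    sharing: bool

def company_may_join(user_policy: Policy, company_practices: Policy) -> bool:
    """Inverted Terms & Conditions: the company is admitted only if it
    performs no practice the user's policy forbids."""
    return all(
        getattr(user_policy, area) or not getattr(company_practices, area)
        for area in AREAS
    )
```

Under this sketch, a user who forbids analysis and sharing would admit a company that does neither and reject one that practices everything.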

Experiment

For the experiment I created a fictional letter from Fair Trade Data. The first page gives the user a brief introduction to the service, and the other two pages cover the individual setup of their own policies. During the desk research and the first experiment, I had identified five areas around privacy and data: collection, usage, analysis, storage & security, and sharing.

Bringing the brand to life: Documents of Fair Trade Data.

The participants filled out the questionnaire while I encouraged them to think aloud. I also used the IoT prototypes from the first experiment to explain different scenarios and turn the abstract topics into something more tangible. The aim was to find out what is most important to users in relation to the five areas around privacy and data collection.

Main Insights

Users’ general concerns about data and privacy lie primarily in the areas of sharing, usage and analysis of data. In these areas, users have no influence on how their data is handled.

  • Users do not trust the way organizations use their personal data.
  • Users feel an imbalance between their own benefits and the company’s benefits related to data sharing, usage and analysis.
  • Users need transparency and control to build trust towards a product or service.
  • Data is very abstract and not every user understands the complex context.
  • Users appreciate the idea of a common visual sign for a fair use of data which provides orientation and familiarity.
  • For companies, joining is a statement that the privacy of their users matters.
  • Users are concerned about how independent Fair Trade Data can be.

Insight #1: About Transparency

Users wish to have more transparency in the areas of usage, sharing and analysis of private data. Their main concerns are “Who is seeing my data?” and “Do they make a profit with me?”. The participants are interested in knowing what happens with their collected data, and they want to compare how companies handle and share their data to find out who profits most.

Insight #2: About Control

Users wish to have more control in the areas of sharing, storage & security and analysis of private data. Most users feel uncomfortable when data is shared between different information environments: for example, when the private browsing history is shared with the work environment, or private data from work is shared with the doctor. Users also wish for a right to be forgotten, with more control over deleting their stored data.

Stage 3: Testing the Future Scenario

The two design experiments yielded useful insights that informed the last two stages of the project. In this stage, I developed the Fair Trade Data service further to create and test the future scenario of the provocation “What if companies have to accept my Terms & Conditions?”.

Introduction to Fair Trade Data

To bring the service to life, I decided to create different touchpoints and confront users with these scenarios. First, I created a labeling system for IoT products, which can be placed on the product packaging or directly on the device:

What if IoT products are labeled?
What if every connected device is labeled?

Second, I created a personalized Terms & Conditions widget for online services like Uber or Airbnb. The service replaces the usual “Accept Terms & Conditions” box with a widget from Fair Trade Data:

What if Fair Trade Data is implemented into the sign up process?

The second widget can be used by online shops like Amazon to give users an overview of how closely the Terms & Conditions of an IoT product follow their individual privacy policies:

What if Fair Trade Data is implemented on shopping sites?
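One way such an overview score could be computed, sketched under the assumption that both the user’s policies and a product’s terms reduce to one boolean per area; the function name and encoding are hypothetical:

```python
def conformity_score(user_rules: dict, product_terms: dict) -> int:
    """Percentage of the user's privacy rules a product's terms satisfy.

    user_rules:    area -> True if the user permits that practice
    product_terms: area -> True if the product performs that practice
    A rule counts as satisfied when the user allows the practice, or
    the product does not perform it at all.
    """
    satisfied = sum(
        allowed or not product_terms.get(area, False)
        for area, allowed in user_rules.items()
    )
    return round(100 * satisfied / len(user_rules))
```

For a user who forbids sharing and analysis, a product that shares data but performs no analysis would satisfy four of five rules, giving a score of 80.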

User Testing

To test the different scenarios, I placed these elements on websites and products and showed them to different people to get feedback and start a discussion around privacy, data collection and Terms & Conditions.

Discussing the Fair Trade Data scenario with users.

I started the discussion with a screen of the Amazon website. The participants liked how easily they could compare products. They also felt trust and familiarity when they saw the Fair Trade Data logo: because they know the brand’s rules, they know the company’s attitude towards privacy and data handling.

The second scenario was about the registration process for digital services like Uber. The participants found this small hint very useful and would consider not signing up if a score were very low. At the same time, they also wanted to view the analysis of why the score is low and consider making a compromise.

User Emotions

After the discussions I mapped the emotions and thoughts of the participants:

Insights about Emotions during the testing.

A unified Terms & Conditions service and labeling system for fair data handling could influence users’ purchase decisions. At the same time, the system can build trust between companies and their users. It would lead to a more balanced relationship in which both sides benefit: users can determine their own policies, while companies communicate that they respect users’ values by committing to their demands.

Stage 4: Final Outcome

The final future scenario is a possible future where an ethical use of personal data is labeled. In this future, products and services which meet common standards around data collection, usage, analysis, storage & security and sharing will be allowed to use the labeling system of Fair Trade Data.

Making fair use of data visible: An online and offline labeling system.

Users can sign up to Fair Trade Data and set up their own privacy policies. The system makes it easy for users to identify products and services that meet their demands and support their values.

This labeling system creates trust between users and companies and encourages more conscientious behavior around data use and privacy. In the end, Fair Trade Data can contribute to a more balanced relation between user and provider benefits.

Terms & Conditions API

Service providers can integrate the Fair Trade Data widget into their sign-up process. Users get a short, concise overview of how closely their individual policies are followed and are informed about how their data will be handled.

The Fair Trade Data widget can be used for digital services.
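How such a widget might phrase its short overview is purely illustrative; the thresholds and wording below are assumptions, not part of the service:

```python
def render_widget(score: int) -> str:
    """Turn a conformity score (0-100) into the short, concise overview
    shown in place of the usual 'Accept Terms & Conditions' checkbox."""
    if score >= 80:
        verdict = "follows your policies"
    elif score >= 50:
        verdict = "partly follows your policies"
    else:
        verdict = "conflicts with your policies"
    return f"Fair Trade Data score {score}%: this service {verdict}"
```

A sign-up page would fetch the user’s score for the service from the platform and render this line next to the Fair Trade Data logo.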

Makers of IoT products and connected devices can use the widget to show that they meet the Fair Trade Data standards, and how closely their product follows the users’ individual policies.

Users are informed at online shops through a personalized labeling system.

Online Platform

On the Fair Trade Data online platform, users can register and set up their own privacy policies, which are then used by the widgets to inform them about the conformity of a particular service or product.

Inspired by user insights from the design experiment, the app can also be used to keep track of accepted Terms & Conditions. Users can compare their products and services in terms of data collection, usage and sharing:

Conclusion

With this experiment I gained insights into what people think about the future of privacy in our digitalized world and explored what kind of future is possible.

The Future of Privacy

The aim of the digitalization of our physical world should be to create a seamless experience without violating privacy and trust. This is just as much a design problem as it is an ethical one. With emerging technologies in our physical space, companies gain intimate insights into our lives. It is important to focus not only on the possibilities of using this data, but also on an appropriate way to act in return.

The Role of Service Experience Design

To avoid a world of total surveillance and to use technology to create a delightful user experience, we as Service Designers always have to think about the implications a design will have. The time has come for Service Experience Designers to get very intentional: we have to ask how data is collected and what it is used for. The following five points will characterize the Service Experience Designer of the future:

  • #1 Shape the characteristics of technology (Kelly, 2016) 
    Service Experience Design has the potential to take ethical matters into account and respect society’s core values.
  • #2 Create a balance between user and company benefits
    While creating services and products, designers should aim to generate equal benefits for users and companies.
  • #3 Build trust with transparency and control for the user
    One important job of Service Experience Designers will be to design for trust. “Trust is something people are willing to pay for”, said Kevin Kelly in an interview with Mozilla (2016). “Trust is a unique asset which cannot be copied. It has to be generated in exchange.”
  • #4 Be critical and ask questions
    The task of Service Designers of the future will be to question everything. Every detail matters and contributes to the overall user experience. It is important not only to focus on obvious actions but also to consider the implications of invisible processes (e.g. data usage and analysis).
  • #5 Create valuable experiences
    The overall aim of Service Design should always be to create valuable experiences. A good experience continues far beyond a user interaction and includes the processes that happen in the background.

In the end, our job is to design experiences that benefit both our clients and consumers. As Hector Moll-Carrillo (2015) from Eight Inc. states: “Good design should create relationships built on enjoyment and trust, not suspicion and frustration.”


Bibliography

  • Calo, Ryan (2013): Tiny Salespeople: Mediated Transactions and the Internet of Things. IEEE Security & Privacy, Vol. 11, No. 5, pp. 70–72 (September–October 2013). Available at SSRN: https://ssrn.com/abstract=2353115
  • Evans, Dave (2011): The Internet of Things: How the Next Evolution of the Internet Is Changing Everything. Cisco IBSG.
  • Helbing, Dirk; Frey, Bruno S.; Gigerenzer, Gerd et al. (2017): “Will Democracy Survive Big Data and Artificial Intelligence?”. Scientific American. Available at: https://www.scientificamerican.com/article/will-democracy-survive-big-data-and-artificial-intelligence/. [Accessed 07 Jun. 2017]
  • ICO (2017): Feedback request — profiling and automated decision-making. Information Commissioner’s Office. Available at: https://ico.org.uk/media/about-the-ico/consultations/2013894/ico-feedback-request-profiling-and-automated-decision-making.pdf. [Accessed 16 Apr. 2017]
  • Kelly, Kevin (2016): The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future. Viking, New York, New York.
  • Moll-Carrillo, Hector (2015): “The Right to Privacy: Four Ways Design Can Protect You in a Connected World”. Eight Inc. Available at: http://eightinc.com/insights/right-privacy-four-ways-design-can-protect-connected-world. [Accessed 10 Mar. 2017]
  • Mozilla (2016). 12 Technology Forces Shaping the Next 30 Years — Kevin Kelly at Mozilla Speaker Series. [Online Video]. 13 June 2016. Available at: https://www.youtube.com/watch?v=JEsgxHxIr9w. [Accessed 7 June 2017].
  • Nixon, P. A.; Wagealla, W.; English, C. et al. (2005): “Security, Privacy and Trust Issues in Smart Environments”. In: Smart Environments, pp. 249–270. DOI: 10.1002/047168659x.ch11.
  • Palmer, Danny (2017): “Internet of Things security: What happens when every device is smart and you don’t even know it?”. ZDNet. Available at: http://www.zdnet.com/article/internet-of-things-security-what-happens-when-every-device-is-smart-and-you-dont-even-know-it/. [Accessed 06 Jun. 2017]
  • Rosner, Gilad (2017): “OK, Google, You’re Creeping Me Out: Advertising in the Age of Voice Devices”. Medium. Available at: https://medium.com/startup-grind/ok-google-youre-creeping-me-out-advertising-in-the-age-of-voice-devices-87af722d414d. [Accessed 12 May 2017]
  • Rotenberg, Marc; Gartland, Claire (2016): “Complaint and Request for Investigation, Injunction, and Other Relief”. epic.org. Available at: https://epic.org/privacy/kids/EPIC-IPR-FTC-Genesis-Complaint.pdf. [Accessed 16 Apr. 2017]
  • WikiLeaks (2017): “Vault 7: CIA Hacking Tools Revealed”. Wikileaks.org. Available at: https://wikileaks.org/ciav7p1/. [Accessed 06 Jun. 2017]