Ease of use / use of ease

Jan Wessel Hovingh
14 min read · Aug 20, 2017


What UX designers could learn from car crashes.

TL;DR

User Experience (UX) Design within the realm of interactive, web, or application design focuses not so much on creating an interface to operate a machine as on creating an emotional and unconscious experience that guides, even coerces, the user into completing a task. Ease of use is paramount. Moreover, ease of use should be translated into a positive experience by designing interfaces that make operating the devices and services they mediate ‘feel good’.

By hiding the complex inner workings of a device or service, most interfaces make a user less aware of the possible risks of using that product. The most important risk here is infringement of privacy, because these devices and services can record, store, aggregate and share a lot of personal data.

Many companies, non-commercial organizations and governments make good use of this data, be it for commercial purposes, law enforcement or counterterrorism. Services and devices are even intentionally designed to coerce their users into sharing as much of this intimate data as possible, because accurate personal data is extremely valuable.

In other words: Ease of use becomes use of ease.

Therefore, UX Design is often not so much about enabling, informing and warning a user, but more about assuring, soothing and coercing them to do what is best for the business model. In short: what is now considered good UX Design is sometimes bad for privacy awareness.

This poses a dilemma, but at the same time it could provide new opportunities.

With events like the disclosure of PRISM and the data leak at Yahoo, awareness of privacy risks and the demand for transparency on the use of data are increasing, but the implementation of proper global legislation remains difficult. Data protection and privacy are possible through a variety of technical solutions, but most are difficult to grasp and hard to use for the average user.

By using their skills to raise risk awareness and give users insight, UX Designers can educate users on the importance of transparency in the use of personal data in digital products and services. Moreover, they should provide ease of use in security technology. This may lead to a new design language for UX Design, in which transparency about stored and used data will play an important role.

Introduction

My first car was a 1988 Saab 900 and I loved it to bits. Literally. Most of the time, I could tell by a rattle or hum which bushing or bearing had gone. It was fairly easy to maintain on a do-it-yourself basis. By the standards back then, it was ahead of its time and you could see that the designers had put a lot of thought into usability and safety: it was built like a tank. Everything about it was built to keep the occupants safe in case of a head-on collision. Nonetheless, the technical basics were fairly simple, easy to understand for someone with basic technical knowledge. It was –and I think still is– as transparent as technology can be.

Cutaway of a Saab 900 T16S, 1988 — Illustration by Rony Lutz

But the old Saab could not be more different from present-day cars. The big touchscreen in a Tesla, for example, shows that cars have become more or less computers with wheels. Massively complex computers with wheels. Virtually everything in the car is operated and controlled by an internet-connected computer. But what is truly fascinating is that it does not take a PhD in Computer Science to drive a Tesla. The only reason it does not is the car’s carefully designed interfaces.

It’s all about the interface

Well-designed interfaces make a highly complex vehicle like a Tesla usable for an average human being. Through visual, audible and tactile feedback they inform, confirm and assure us, so we can concentrate on more important things. We think we are driving a Tesla, but in fact we are operating a highly complex machine. We still experience a car, but we actually operate a computer.

Tesla Model X Dashboard, Image source: Motortrend

Computers have become extremely fast and small over the last couple of decades. The computing power of an average 2017 smartphone exceeds that of a 1991 supercomputer, and we now produce more data in two years than was produced in the entire previous history of mankind. But there may be a catch. As we move into an age of complete technological immersion, it becomes harder than ever for humans to grasp the complexity of the technological infrastructure, power structures and business models of the products we use.

The very principle of sound interface design largely depends upon translating this technological complexity into a level of complexity the intended user is able to grasp. But in doing so, that very user becomes less aware of the underlying technological complexity and the possible risks it may pose.

The art of reducing friction

It is important to observe that designing digital interfaces is actually a form of industrial design. We design products that are to be used by humans. In an attempt to describe what good industrial design is, the late Victor Papanek refers in ‘Design for the Real World’ to Henry Dreyfuss:

‘When the point of contact between the product and the people becomes a point of friction, then the industrial designer has failed.’

Dreyfuss, Designing for People, 1955

This quote describes the very essence of proper interface design: it is about reducing friction. A product should be easy to use, and if the product itself is too complicated or too cumbersome to be operated by an average human being, it needs a well-designed interface. Preferably an interface so clear that anyone can use it immediately. This is why it does not take a lengthy manual to grasp the basics of an iPhone, nor does the use of Facebook require an eight-week training course. This principle of ‘onboarding’ (Cooper, 2014:244) or “First Time User Experience” is at the very core of proper digital product design.

Ease of use

Someone else who explains the principles of proper interface design very well is Steve Krug. While mainly focusing on web design, Krug advises designing and building interfaces in such a way that navigating them becomes an intuitive task rather than a cognitive one. A good web page should be “self-evident. Obvious. Self-explanatory.” (Krug, 2006). This very principle is still the starting point of many interface, interaction and user experience designers.

Furthermore, Krug’s and Dreyfuss’s views both align with the principles of ubiquitous computing that Mark Weiser first stated three decades ago: a human does not necessarily desire to operate a computer. A human wants to reach a goal, to complete a task. The computer should help that human complete that task.

Weiser was visionary: he predicted the use of smartphones and tablets twenty years before the introduction of the iPhone, and he foresaw the widespread implementation of computers in his principle of ubiquitous computing. He tended to depict a somewhat utopian future. In one of his talks, Weiser stated some principles for what he called ubiquitous computing (Weiser, 1996):

• The purpose of a computer is to help you do something else.
• The best computer is a quiet, invisible servant.
• The more you can do by intuition the smarter you are; the computer should extend your unconscious.
• Technology should create calm.

Calm technology?

Although these ideals may have provided powerful foresights and inspired many around the world, present-day reality does not seem to match Weiser’s ideals. While computers may have enhanced our lives and replaced many tasks that used to require tedious, repetitive manual labour, they have also become a cause of a great deal of stress: we seem to have developed a severe addiction to them.

But more importantly, and in contrast to what Weiser stated: doing things mainly by intuition does not necessarily make us smarter. On the contrary: as early as 2008, Nicholas Carr stated in his somewhat dystopian essay “Is Google Making Us Stupid?”:

“As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation.”

Carr, 2008

Apparently something is very wrong with the things interface and UX designers think they are doing right: by reducing friction through user-friendly interface design, by ‘not making them think’, we are dumbing people down, making them too dependent on technology and encouraging anti-social behaviour.

The contextual borders of data

I considered this a challenge. In 2015, I decided to run an experiment by building an app with my students. We would devise an online service that required putting some effort into a task, to see whether that effort would improve the user’s sense of value.

In short, the service would be as follows:
Together with my students, I devised a mobile app that would plot local grocery sellers on a map and generate a cycling route based on the ingredients of a chosen recipe. While using the app, people would get a workout, consume local products and meet new people. In the end, they could replenish the burnt calories by preparing the recipe and enjoying a meal.

Picture: an unmanned booth for selling locally grown food; Inset: screenshot of the app

Upon finalizing the app, we were confronted with a very unpleasant side effect. The stored GPS data of the sellers would reveal that those people were making money by selling groceries. As the tax rules in the Netherlands are quite strict on the definition of running a business, this would give the Dutch tax authority a very useful tool to collect some extra VAT and income tax.

While developing the app, we had unintentionally provided a user-friendly way of collecting and storing personal data. Personal data was necessary to make the app work, but that data could also be used in different contexts with very different (and negative) outcomes.
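With hindsight, even a crude form of data minimisation would have softened this risk: store only coordinates that are precise enough for the app to function, and nothing more. Below is a minimal sketch of that idea in Python; it is a hypothetical illustration, not what we actually built, and the function name, precision and example coordinates are my own assumptions.

```python
def coarsen_location(lat: float, lon: float, decimals: int = 2) -> tuple[float, float]:
    """Round coordinates before storing them.

    Two decimal places is a grid cell of roughly 1 km: precise enough
    to route a cyclist towards a neighbourhood, but too coarse to point
    the tax authority at one specific address.
    """
    return (round(lat, decimals), round(lon, decimals))

# A fictional seller near Leeuwarden:
print(coarsen_location(53.2012345, 5.7998765))  # -> (53.2, 5.8)
```

The point is not this particular function, but the design stance: collect the minimum the task requires, so the data cannot easily migrate into another context.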

Obviously, this did not feel good.

What Privacy really is

This phenomenon, where personal information willingly or unwillingly migrates from one context to another, can cause a thing called ‘context collapse’. It occurs when information and/or behaviour from one context of an individual’s life ends up in a different context of that individual’s life (Vitak, 2012). As the experiment “Superstream Me” showed, context collapse leads to awkward situations at best, and to unpleasant or even dangerous situations at worst.

In this experiment, two young journalists recorded their lives on camera for two weeks straight, 24 hours a day. In the end, one of them came close to an actual nervous breakdown: the fact that he was consciously living in all of his different personal contexts at once turned out to be emotionally suffocating.

Use of ease

Privacy is not just about hiding things, or having and keeping secrets. Privacy is about having the right and the ability to ensure that your personal, intimate data does not end up in contexts where it may be of annoyance or harm to you. Helen Nissenbaum calls this way of looking at privacy ‘contextual integrity’ (Nissenbaum, 2004). But the problem is that we now share so much personal data that it has become very hard to keep track of it all. We mostly find out when it is already too late, with sometimes disastrous consequences. The hack of the dating website Ashley Madison proved that these consequences are sometimes too much for a human being to handle.

So a leak or hack is bad enough, but the biggest challenge is the current business models of many Big Tech companies. These models are based upon ‘mining’ and storing personal data and using that data for purposes other than those their users think they signed up for. The interfaces these companies provide for their services are so well designed that most users are completely unaware their data is being harvested. It is even worse: these services are intentionally designed to gather as much personal data as possible.

This business model has been called Surveillance Capitalism (Zuboff, 2015), and apart from the commercial incentive, many governments make good use of all this stored data as well, through various post-9/11 laws like the Patriot Act, for intelligence and counterterrorism purposes.

The dilemma of good UI/UX design

So in terms of UI/UX design, this is the dilemma: by doing our work to the best of our abilities, by converting complex technology into usable tools, we indirectly help make our users dependent on technology and subject to data harvesting. So you could argue that interface designers may unwillingly have become some kind of drug dealer.

That is not a role I would want to be in.

But what to do? The organizations that provide UI/UX Designers with work, the ones that pay invoices and/or provide salaries, are currently the ones that benefit most from business models based on data harvesting. Whether you work for Google or Facebook, or do freelance jobs for such firms: biting the hand that feeds you does not seem the smartest thing to do.

But Mike Monteiro thinks you should.

Mike Monteiro at Webstock 2013

In short, Monteiro states: unethical design is not something you should want to make your money with. Blake Watson thinks so too: he left Facebook over it. Tristan Harris left Google to start timewellspent.io. And Aral Balkan vehemently battles the tracking cookies embedded in most commercial websites by providing his better.fyi app for free. These are just a few, but the number of ethical and conscious designers is rising.

But what about the profession of UI/UX Design in general? Are we to decline assignments that seem unethical or morally questionable in terms of privacy and user manipulation? Would that not amount to digging our own graves? How can we convince our clients and managers to take a more ethical approach?

Well, it turns out that safety, transparency and privacy may be a very good starting point for design, and for that we need to take a look at the car industry.

What we can learn from car crashes

There are parallels between the way the car industry of the first half of the 20th century and Big Tech of today build and maintain their products. Both the technology and the industry were relatively new at the time, and both had skyrocketing numbers of customers and profits. Some laws on safety were in place, but certainly no international ones, and many products were built to a price: company interest prevailed over customer interest. The Chevy Corvair, for example, had lethal shortcomings because cost cutting and styling prevailed over safety (Nader, 1972:40).

So let us go back to the old Saab. In 1988, these cars came fitted as standard with three-point seat belts in the front and the back seats, including a warning light on the dashboard. More importantly, the very construction of these cars was aimed at passenger survival in case of an accident. At that time, this was not yet required by law, but safety had always been an obsession of the Swedes. Thirty years earlier, in 1958, Saab was actually the first company to fit seat belts as standard equipment. Volvo, the other Swedish car brand, offered its patent for the now famous three-point seat belt free of charge to other car makers: “The decision to release the three-point seat belt patent was visionary and in line with Volvo’s guiding principle of safety.”

Nils Bohlin, the inventor of the three-point seat belt.

It is important to stress that vehicle safety at that time was something most car manufacturers were not really interested in. The now famous book “Unsafe at Any Speed” by Ralph Nader illustrates how US car makers actively resisted vehicle safety regulations with the argument that the public would not be interested in buying such cars (Nader, 1972:114). In fact, Nader states, the companies –GM in particular– feared the public would be constantly reminded of the possibility of an accident and thus would be reluctant to buy such a car (Nader, 1972:115).

But as time passed, it became clear that seat belts were saving lives. Partly because of the fallout of Nader’s book, governments –the US government in particular– started to support seat belts and later even made them mandatory. It was not until 1991 that new cars had to be fitted with seat belts front and back, but since then vehicle safety has increased continuously, and fast too. We have grown to consider it completely normal –even mandatory– to travel in cars fitted with more than ten airbags, ABS, LDS, TCS, etc. Safety has become the norm, even a competitive edge in cars.

If there is a market for safety, there is one for privacy too

This shows the true power of safety by design. Seat belts were intended to stop people from flying out of cars in case of an accident, and while only a few companies started using them right away, the seat belt gained momentum. As more people realized you could actually survive a serious crash when wearing a seat belt, it even gave brands like Saab and Volvo the reputation of producing the safest cars money could buy. One design choice built a reputation and paved the way for legislation.

Volvo advertisement from the 1980s

These brands still carry a reputation of being extremely safe, even though many other car brands have surpassed them in Euro NCAP scores by now. Regulation and law making enforce safety standards in the car industry, and that is why traffic-related deaths are at an all-time low, even though there have never been more cars on the road.

Apple, the Volvo of Big Tech?

Of all the big tech companies, it seems that Apple is trying to do the right thing. Their concept of ‘Differential Privacy’ is at least an attempt to move in the right direction. Last June, Apple announced that Safari would ship with “Intelligent Tracking Prevention”, a feature that “reduces cross-site tracking by further limiting cookies and other website data.” Since their battle with the FBI over the unlocking of a terrorist’s iPhone, they seem to be at the forefront of user privacy protection, but they are not quite there yet: in order to comply with regulations in China, they removed all VPN providers from their Chinese App Store. Apparently every principle has its price.
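The idea behind differential privacy is easier to grasp with a small example. One classic building block is ‘randomized response’: every user adds noise locally, before anything leaves the device, yet the aggregate statistic can still be recovered. The Python sketch below illustrates that general principle only; it is not Apple’s actual implementation, and the probability value and function names are my own assumptions.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    # With probability p, report the truth; otherwise report a fair coin flip.
    # No single report reveals the user's real answer with certainty.
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_rate(reports: list[bool], p: float = 0.75) -> float:
    # The observed rate is p * true_rate + (1 - p) * 0.5; solve for true_rate.
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
population = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(x) for x in population]
print(f"Estimated rate: {estimate_rate(reports):.3f}")  # close to 0.300
```

Each individual answer is deniable, but the population-level statistic survives: exactly the kind of trade-off a privacy-respecting interface could make visible to its users.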

But still: if Apple can focus on user privacy, so can we. And we should. As of now, laws are in place that should help secure user privacy and prevent data leaks, but there is more to it than simple compliance. The European General Data Protection Regulation (GDPR) will come into effect in May next year, so we have our work cut out for us if we at least want to comply with those regulations, let alone adhere to their underlying principles.

So I am wondering:

What will be the three-point seat belt of UI/UX design?

Sources used/Further reading:

Cooper, A., Reimann, R., Cronin, D., Noessel, C. (2014). About Face: The Essentials of Interaction Design (4th edition). Indianapolis, Indiana: John Wiley & Sons, Inc.

Papanek, V. (1971). Design for the Real World. Chicago: Academy Chicago Publishers.

Dreyfuss, H. (1955). Designing for People. New York: Allworth Press

Krug, S. (2006). Don’t Make Me Think (2nd edition). Berkeley, California: New Riders

Nader, R. (1972). Unsafe at any Speed (Updated). New York: Grossman Publishers

Zuboff, S. (2015). Big Other: Surveillance Capitalism and the Prospects of an Information Civilization. Journal of Information Technology, 30, 75–89. Accessed 06-06-2017

Vitak, J. (2012). The Impact of Context Collapse and Privacy on Social Network Site Disclosures. Journal of Broadcasting & Electronic Media, 56, 451–470

Nissenbaum, H. (2004). Privacy as Contextual Integrity. Washington Law Review, 79, 101–139


Jan Wessel Hovingh

Context collapse: Designer, lecturer @nhlstenden, bassist, cyclist, car mechanic. Works on interface design for privacy @intimate_data.