The Efficacy of Interface Design

Why we have to design interfaces with care and trust

Article based on the talk by Jan-Wessel Hovingh at World Information Architecture Day 2018, held at GriDD's offices in Utrecht.

During his presentation at the World Information Architecture Day summit, Jan-Wessel Hovingh spoke about how today's interface designs elicit user trust, even at the expense of transparency and conscious decision making. Jan-Wessel is a designer and lecturer who specializes in designing interfaces for privacy, and if there is anything he loves, it is user interface design.

A Love For User Interface Design

My love for user interface design stems from the ability of user interfaces to make complex machines and systems usable for regular human beings. In today's technology-driven world, one doesn't have to know how complex technology works to be able to use it.

More and more we are able to harness the power of computing through user interface design. In a time frame of seventy years we moved from ENIAC, one of the earliest electronic general-purpose computers, to today's technologies. ENIAC was developed about seventy years ago to calculate complex formulas for the atomic bomb and missile trajectories, and it was a huge machine: it covered 1,800 square feet of floor space, weighed about 30 tons, and required six people just to operate it. Seventy years later we have the Apple Watch, which we can all operate without even glancing in the direction of a manual, all because of user-friendly interfaces.

To put things in perspective: if you wanted to replicate the computing power of an Apple Watch with the technology of seventy years ago, you'd need to get your hands on about three million ENIACs. You would then need about 2,500 Nimitz-class aircraft carriers just to power them, the machine would take up the entire Chicago metropolitan area, and you'd probably need every single person living there to operate it. And now, and this is why everyone should love user interface design, we can all harness that immense amount of power on our wrists, because its user interface is designed so well that you can just start using it. So it is not only about a change in computing power; it is more than anything about how far we have come in interface design.

Ambient Computing

Good user interfaces are rarely obvious to average users, and the best interfaces are the ones you don't even notice. This is rapidly becoming the norm: many user interfaces are effectively invisible, to the point that they become ambient computing, and you don't notice them even while using them. For example, your car is not just a car; it is a computer. A computer with wheels, but a computer nonetheless. The same goes for your toothbrush, which is a computer used for achieving a very high standard of clean teeth. For so many products and services, when you come to think about it, you don't experience using a computer at all. According to Mark Weiser, well known for his publications on ubiquitous computing, the best user interface design is not noticeable. A searching look will reveal computers everywhere, but you don't notice them or experience using them even while you use them.

Despite these advances in design and computing, we are also at the point where the experience of using these user interfaces prevails over conscious decision making. So now you are using a computer even when you think you are using a consumer product. This has consequences for how you experience the risks and contingencies of using that product.

User interface design has evolved to provide us with the best possible experience, and will continue to evolve the way we interact with computers, but at the same time we have to be careful.

Consequences of Interface Design

Even the best designs have their flaws, and the side effects of a flaw in a computer or digital product can be serious. One experience I had personally was when I was designing an app that connected local farmers, who had grocery stands on the side of the road, with potential customers cycling around the countryside. The app was a solid idea with a nice design and user experience, but it turned out that the Dutch tax authority could use the app to investigate people who sell these groceries illicitly to make a little extra money tax-free. The farmers might get into trouble because their activities were all of a sudden traceable. So in the end, a perfectly fine user experience turned out to be problematic, because we had not assessed the implications of using the personal data.

After doing some research we found that there are tons of incidents where the use of apps and other digital products has led to serious side effects. There is the case of a father finding out that his teenage daughter was pregnant when Target, after profiling her other purchases, sent her advertisements for maternity products. Another girl, a closeted lesbian this time, was kicked out by her family when they found out about her sexuality after she was added to a gay choir's group on Facebook. These side effects are not intentionally designed into the products, but they seem to be unavoidable.

Where’s My Digital Data?

Even though divulging personal data can be dangerous, most users of computers and digital products are coerced into surrendering a lot of personal data to these digital systems. Here, the user-friendly interfaces redirect our attention away from the restraint we should perhaps exercise. Take, for example, Facebook, Google, Twitter, and various search engines, which request a lot of our data, which we give up with very little knowledge of where it may end up. When you come to think of it, we are only vaguely aware of what happens with our information. Consider the almost monthly NSA data breaches, the PRISM program, and data from social media and search engines being shared with big corporations and even the secret services. This is why I worked on design tools that help interface designers evaluate possible privacy risks.

Why Are We Not Taking Good Care Of Our Data?

To answer the above question, I carried out a small study on an unsuspecting group of randomly chosen individuals who thought they were participating in smartphone research. Each individual was given a questionnaire containing the questions digital companies usually ask, and then some more. The goal was to gauge their reactions to, and experience of, the research itself. It turned out that people are not as eager to share information with someone they know, or who has direct access to them, as they are to share information with large companies. Even when the respondents were aware that their data would only be used for this research, they were less likely to share it.

The common reaction was that they saw the act of supplying data to these digital platforms as a natural process. Some admitted that they were horrified by the researcher asking these questions, yet did not have the same feeling at all when asked by a Google or Facebook page. So it seems these digital companies, platforms, and interfaces have led us to trust them and not question their systematic processes. After the research experiments, some respondents became aware of their privacy and took steps to protect it, but they couldn't sustain that behavior. There was lots of lingering cognitive dissonance that encouraged trust both in the devices and in the system.

Trusting the Interface Design System

All human beings need trust in order to live in a normal way. We can't know everything, and therefore we have to trust certain things to be able to interact with them. For example, when we get into a car and drive along the freeway, we just assume that nothing will go wrong. We assume that the car is safe, we assume that no one will crash into us, we assume that everybody will adhere to the traffic rules. We trust that our well-designed system interfaces will do just what they are meant to do, because we don't completely understand how computers work. We still interact with them despite our ignorance, because we trust that the person who designed or built them did a good job.

These smart technologies help us talk to our friends and colleagues and be part of the social network, and we trust the technology that connects us. We don't think about the functionality because they're designed to be easy to use, with user interfaces that simplify these networks to a level we can understand and interact with. However, a lot of these interfaces and apps hide a lot of data about us from us, and a lot of big businesses are collecting as much data as possible on us without our knowledge or consent. This inspired a design process that does a better job of protecting our privacy.

Trust-worthy Participatory Design

Most people have already formed a solid idea of how things should work. For example, if we talk about a car, everybody will have a mental model of a four-wheeled steel contraption. They may not know anything about how the engine or the steering rack works, or how the wheels are connected to the car, but that doesn't stop the formation of the mental model. In fact, we create mental models of all products and services we encounter. I believe that this mental model of products and services dictates how people perceive them in real life. If this is the case, then the users should be the ones doing the designing, especially when it comes to the use of their personal data.

One of the best-known proponents of this school of thought is Larry Tesler, who designed graphical user interfaces at Xerox PARC and has become the godfather of all graphical user interfaces. Larry used a lot of participatory design when creating the user interface: he literally asked people how they would like to write a letter on a blank screen, without any prototype whatsoever. He later called this the blank-screen experiment. In it, respondents dealing with the new interface for the first time would say things like: "well, I would point over there," or: "I would hit the delete key to correct," and he used these cues to design the interface.

Another inspiration is Anthony Giddens, who states that trust can be mobilized only through a process of mutual disclosure. Mutual disclosure, in this case, implies creating a design process that brings all creators and end users to the table to explain to each other how the interface works. This would ensure that all parties maintain a process of mutual disclosure and, as a result of this participatory design, have a shared mental model.

This is the core value of a trustworthy user interface, where users can, for example, delete shared data and have full control over their privacy. The lesson we can learn from this is that the only way to have a truly trustworthy interface design is to have a truly trustworthy interface design process. Only through open dialogue and a shared mental model will we all know what we're talking about.
