AI x Design

Don’t Push the Button!

Every time you push a button, the button pushes back.

Erik van der Pluijm
WRKSHP

--

So many buttons. So much control. It did not help them at Chernobyl. (Image found online)

Button Zero

Just a century ago, many houses in the city of Amsterdam, where I live, did not yet have electric lighting. The electric revolution was just beginning. For most of the people who installed electric lights, the moment they used the light switch for the first time was their first introduction to the button.

Before that moment, they had never had to push a button in their lives.

Today, the light switch is just one of the countless buttons we use to control our daily life. It’s hard to imagine going a day without pressing a button.

In fact, because the electric revolution made it so easy, the button has become deeply entangled with our idea of control. Every time you press a button, you give a tiny command. We have started to think about our world and the machines and devices in it as something that has to be controlled, that needs minute-by-minute input. If you want something to happen, you need to push a button.

In some cases, devices have even started to close the loop: my dishwasher beeps when it is done, and continues to do so until I press a button. In effect, the beeping noise is triggering a command for me to come push the button.

Computer programming has taken this ‘control by command’ attitude even further. Computer programming means stringing commands together in excruciating detail, adding commands for every possible situation.
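A minimal sketch of this command-by-command style, as a hypothetical light controller (the commands and state here are invented for illustration): every possible situation needs its own explicit branch, and nothing happens unless it is explicitly commanded.

```python
# A hypothetical, command-driven light controller: every situation
# needs its own explicit command; nothing is anticipated.
def handle_command(state: dict, command: str) -> dict:
    if command == "on":
        state["on"] = True
    elif command == "off":
        state["on"] = False
    elif command == "dim":
        state["brightness"] = max(0, state["brightness"] - 10)
    elif command == "brighten":
        state["brightness"] = min(100, state["brightness"] + 10)
    else:
        raise ValueError(f"Unknown command: {command}")
    return state

lights = {"on": False, "brightness": 50}
for cmd in ["on", "dim", "dim"]:  # every change is a separate command
    lights = handle_command(lights, cmd)
print(lights)  # {'on': True, 'brightness': 30}
```

Note how the user (or programmer) carries all the responsibility: forget a command, and the machine simply does nothing.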

Over the last half century, this button overload has helped propel minimalist designers to the forefront. Designers like Jony Ive (Apple) and before him Dieter Rams (Braun) did something extraordinary: they removed all (or most) buttons.

(Still, note that even when touch screens made physical buttons vanish, they only did so by bringing them back as their virtual incarnations, on screens. The paradigm has not changed.)

Anyone who uses their designs has experienced how this reduction in ‘controls’ makes using them feel much more frictionless (or, in some instances, completely excruciating. I’m looking at you, ‘Magic’ Mouse.)

Most radios had way more than two buttons.

The trick these designers use to minimize the number of buttons (controls) is, in essence, to reduce the amount of control or freedom you have as a user. They aim to give you only those options that you need at any specific time.

The reason this does not feel constraining is that they have really done their homework: they have completely figured out what most people want to do at every step. They have distilled statistics-fueled intuition into an amazingly useful product, and everything in the design is focused on communicating exactly what the user is supposed to do.

In essence, the design of the device or program is training you to use it in a frictionless way. And, because we are so completely used to giving these little commands all the time, we don’t even notice it anymore.

But think about it. What do you need to do, exactly, to use your computer to read this text? And what do you need to do if you want to stop and find another article? The movements you make and the thought process you use to plan these actions have been moulded by the ‘button’ paradigm.

Smarter buttons

Today, when I want to change the light settings in my house, I tell Google Voice to dim the lights. In many cases it manages to correctly command my Hue lights to a lower setting. Google works so well that I hardly ever touch a button to change the lights. In fact, when I visit friends, the idea that I can’t ask Google to do it feels strange.

But the paradigm is still the same.

I have to explicitly tell Google to change the lights. That’s not ‘smart’ at all.

In fact, the Zapier setup I have that turns the lights on when the sun goes down gives me much more of a ‘smart’ feeling than the whole Google Voice setup. It is something I don’t need to think about. It goes with the seasons, and changes the time to turn on the lights every day. It is something I have learned to anticipate. It is a quiet, non-obtrusive bit of information that tells me something about the time. And I have only had to tell it once. Something Google Voice can learn a lot from.
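The logic behind that setup can be sketched as a single standing order. The sunset hours below are rough, illustrative values for Amsterdam by month (a real setup, like the Zapier one, would query a sunrise/sunset service instead):

```python
from datetime import time

# Rough, illustrative sunset hours for Amsterdam by month.
# A real automation would look these up from a sunrise/sunset API.
SUNSET_BY_MONTH = {
    1: time(17, 0),  2: time(18, 0),  3: time(19, 0),
    4: time(20, 30), 5: time(21, 30), 6: time(22, 0),
    7: time(21, 45), 8: time(21, 0),  9: time(20, 0),
    10: time(18, 45), 11: time(17, 15), 12: time(16, 30),
}

def lights_should_be_on(month: int, now: time) -> bool:
    """One standing order: lights on after sunset. No daily commands."""
    return now >= SUNSET_BY_MONTH[month]

print(lights_should_be_on(6, time(21, 0)))   # False: June sunset ~22:00
print(lights_should_be_on(12, time(17, 0)))  # True: December sunset ~16:30
```

You tell the system the rule once, and it quietly tracks the seasons for you.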

Now, there are even ‘smarter’ smart appliances, such as Nest, that claim to learn what to do. I have a Nest thermostat. I was eager to see how well it would ‘learn’. I was disappointed. Nest learns your routine very well, mind you — if that routine doesn’t change.

Now, my routine does change a lot, so in my case it was pretty much useless. But even when you have a very fixed routine, in order to teach it to Nest, you need to give it a ton of commands. It’s way more work to do that than to set up a schedule. Which, because you have a routine, is something you can actually do.
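A fixed schedule, after all, is just a handful of rules. Sketched in Python (the hours and temperatures are invented for illustration, not Nest’s actual behaviour or API):

```python
# A fixed weekday schedule as (start_hour, target_celsius) pairs.
# Values are illustrative only.
SCHEDULE = [(6, 20.0), (9, 16.0), (17, 21.0), (23, 15.0)]

def target_temperature(hour: int) -> float:
    """Return the setpoint active at the given hour (0-23)."""
    target = SCHEDULE[-1][1]  # before 6:00 the night setting applies
    for start, temp in SCHEDULE:
        if hour >= start:
            target = temp
    return target

print(target_temperature(7))   # 20.0: morning warm-up
print(target_temperature(12))  # 16.0: away during the day
```

Four lines of configuration, entered once — versus weeks of nudging a learning thermostat with individual commands.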

These are two examples of AI/ML-powered devices, and in both cases the step beyond the ‘button interface’ has not yet been made. The devices simply change the way in which you press a button, but the system is exactly the same. The button paradigm is too deeply ingrained to let go.

What will change in the next generation of devices and other interfaces is that this paradigm will start to go away. Machines are increasingly able to predict (aspects of) behaviour, and are better able to learn how to respond to what you need at this moment. That is the first step. You won’t have to give so many commands any more: instead of micromanaging, it’s more like giving standing orders.

A bigger step forward comes when machines are no longer responding to what you want, but anticipating it.

When you are struggling to fix a lightbulb in a dark closet, a second person in the room will anticipate that you would like more light and either open the curtains or go find a flashlight.

When machines learn to anticipate, that is the moment where you can start to really work together.
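In the simplest terms, anticipation can start as remembering which action usually follows which situation, and then pre-empting it. A toy sketch (the contexts and actions here are invented; real systems would use far richer models):

```python
from collections import Counter, defaultdict
from typing import Optional

# A toy anticipation model: count which action usually follows each
# context, then predict (and pre-empt) that action next time.
history = defaultdict(Counter)

def observe(context: str, action: str) -> None:
    """Record that the user took `action` in `context`."""
    history[context][action] += 1

def anticipate(context: str) -> Optional[str]:
    """Return the most likely action for a context, if we've seen it."""
    if context not in history:
        return None
    return history[context].most_common(1)[0][0]

for _ in range(3):
    observe("in dark closet", "turn on flashlight")
observe("in dark closet", "open curtains")

print(anticipate("in dark closet"))  # 'turn on flashlight'
```

The crucial shift: the user never issues a command here. The system acts on what it has learned, like the helpful second person in the room.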

What will it be like?

Well, first of all, it will mean a lot fewer buttons, and less direct ‘control’. And we’ll need to get used to it. The button paradigm has shaped our brains: every time you push a button, the button pushes back, and we are so used to it that it has shaped our thoughts.

The most obvious way I can think of in which the button paradigm has shaped us is that we need to have mental models of everything we control with those buttons, in high detail. You need to understand what you control so you can evaluate which button will get you to the desired result. And you have had to learn to speak ‘button’ to do it. Every moment of the day your brain is generating commands to control your environment.

In the button paradigm, you alone have 100% responsibility for the outcome of your interaction with the thing you’re using. Now, when you compare that with working together with a human (or even a dog!) that is not the case. You can guess how they might respond in certain situations, but you don’t have to tell them everything. They can anticipate and plan all by themselves.

Peak Button

Thinking back to controlling the lights in my house, I realize that I am not actually interested in controlling the lights at all. I just want to have light to read and do things. And when I’m not there, I want the lights to stay off to save energy. I should not have to tell the system to do this. It should anticipate.

This realization is true for a lot of things we actively control today. Most of the control tasks that we need interfaces for today are the result of a lack of anticipation and prediction.

This decade, we’re at what could be called ‘Peak Button’. Soon, we’ll have the opportunity to start experimenting with systems that predict and anticipate.

Getting rid of our button addiction will not only mean we’ll see a lot less interface. We’ll learn a new way to think as well, freeing us from the need to control and program everything all the time, and giving us the opportunity to work together with machines.

And that goes way beyond how we interact with light switches. It will reshape our standard attitude for all systems we interact with. Our expectations of how these systems behave will change. My guess is that the shift in mindset will have wider implications, changing our world view from a mechanistic, micro-management focus into something better. Computer systems will feel more ‘life like’ and less mechanical.

What do you think will change in your daily life because of this?

--


Erik van der Pluijm
WRKSHP

Designing the Future | Entrepreneur, venture builder, visual thinker, AI, multidisciplinary explorer. Designer / co-author of Design A Better Business