What Does It Mean to Be “User-Friendly”?
In the February 20, 1988 episode of Saturday Night Live, actor Kevin Nealon, playing an Apple technician, helps a confused customer (played by Tom Hanks) with his new computer. Nealon’s character tries to reassure the flustered Hanks with the following line:
When viewers throughout America heard the word “user-friendly” on that Saturday Night in 1988, it’s very possible that they were as much in need of an explanation as Tom Hanks’ character, who in turn replied:
“User-friendly” can refer to a few different things. It’s a Marilyn Manson song. It’s a racehorse. But the term usually describes a product designed so that you can understand how it works as easily as possible.
Far from being a new concept, creating and modifying an object to simplify its use has been at the core of human invention forever. Clothes replaced animal skins. Forks, knives and spoons replaced eating with dirty paws. Ball-point pens you can fit in your pocket replaced sloppy quills. Stoves and ovens replaced hazardous fire pits. Lighters replaced flint.
Somehow, the “survival of the user-friendliest” has shaped the world that we know now: a world of efficient, fast, and easy-to-use tools. But what it means to be “user-friendly” in our modern world is open for debate.
First things first
In the tech world, the term usually refers to software and hardware that most people can figure out with little to no prior experience. Apple made a giant leap in 1984 with the Macintosh, which built on concepts first employed by Xerox’s Star in 1981. The Mac let you manipulate onscreen icons of folders, programs, and files, arranged around the metaphor of a physical desktop, which made it easier to understand. It was a major change from other computers of the era, which required you to memorize a bunch of arcane commands to get anything done.
The concept of user-friendliness, when applied to computers, brings the promise of getting more work done, faster and with a smile on your face. It also simplifies onboarding users onto a project or workflow without having to spend countless hours teaching them how to use various tools.
Over the last few years, computers, tablets, and smartphones began to replace a lot of other inventions, like pay phones, filing cabinets, paper maps, books, CDs, credit cards, game consoles, and even babysitters (just give a baby an iPad — you’ll understand).
Whether or not the digital version of a real-life object is easier to use, the fact that people always have their smartphones near them is sometimes enough to justify creating an “app” version of it. And it’s only a matter of time before the only use we have for hands is to use a smartphone.
But no matter how simple a tool is, the user will always need to have the right one for the job, and to properly learn how it works. And in recent years, this has applied to every single user of modern technology.
Do I have the right tool?
Very often, tasks can seem complicated or even impossible if one doesn’t have the right tool.
In theory, in order to determine which tool is the right one, you could list every existing tool that answers your specific needs, try each of them one by one, and only decide which one feels the most “friendly” afterwards, once you are able to weigh the pros and cons of each.
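If you wanted to make that exhaustive comparison explicit, it might look something like the toy sketch below (in Python, with hypothetical tools, criteria, and weights invented purely for illustration): each candidate gets rated against the criteria you care about, and the “friendliest” option is simply the one whose trade-offs best match what you actually value.

    # Toy sketch of the exhaustive comparison described above. The tools,
    # criteria, and weights are all hypothetical, invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Tool:
        name: str
        ratings: dict[str, int]  # criterion -> rating from 1 (poor) to 5 (great)

    # How much each criterion matters to this particular user.
    WEIGHTS = {"ease_of_learning": 3, "speed": 2, "price": 2, "reliability": 3}

    CANDIDATES = [
        Tool("Tool A", {"ease_of_learning": 5, "speed": 3, "price": 2, "reliability": 4}),
        Tool("Tool B", {"ease_of_learning": 2, "speed": 5, "price": 4, "reliability": 5}),
        Tool("Tool C", {"ease_of_learning": 4, "speed": 4, "price": 5, "reliability": 3}),
    ]

    def weighted_score(tool: Tool) -> int:
        """Sum of each rating multiplied by how much this user cares about it."""
        return sum(WEIGHTS[criterion] * rating for criterion, rating in tool.ratings.items())

    # Rank the candidates; the highest score is the best fit for this particular user.
    for tool in sorted(CANDIDATES, key=weighted_score, reverse=True):
        print(f"{tool.name}: {weighted_score(tool)}")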
In practice, however, people rarely choose the tools they use through a conscious, deliberate comparison, or because they find them “friendly” to use. Trying everything before deciding takes time and money, and external influence often stands in for real logic or methodology.
This is true in the workplace, where scheduling, financial limitations, and team decisions can prevent an employee from choosing for themselves, but also in the household, where budget and time are influencing factors. In both environments, confusion can arise about what choices exist, reinforced by old habits, contradictory claims from manufacturers, and vague word-of-mouth advice from peers.
In the end, which tools end up in which hands (whether the user can figure out how to use them or not) can have more to do with Russian roulette than with real strategy.
In the short history of computer technology, we have seen tools that are complex to use but relatively cheap take over markets because companies buy them in bulk (which in turn encourages employees to buy the same models for their households). We have seen tools designed to be customized and tweaked end up in the hands of novices. We have seen innovative tools lose market share because their competitors launched similar products at a lower price or with larger marketing campaigns.
And tools that are generally seen as easy and intuitive can struggle to win over a customer base for several reasons: manufacturing delays, a high price, saturated markets, or even fashion considerations can all come between a product and its audience.
What ends up prevailing is almost always time and money. Especially if you are working with limited amounts of both, you want to make sure that the right technology is in the right hands, and that people can use it to its fullest. Beyond the historical, cultural, or political contexts behind the very existence of tools, the only way to truly find out which one is the perfect fit is through education.
“User-friendly” is an expression whose meaning constantly evolves, and the most important part of the term is “user”. Tools are designed to be used, and their design should adapt to the way our brains work and reflect our natural human processes, not the other way around. Machines are here to serve us and to execute tasks for us, without slowing us down or getting in our way.
A badly designed tool or system can cause frustration and delays (think of the time wasted closing error messages, resuming work after being interrupted, restarting a project when part of the work was lost, installing programs, uninstalling them, fixing your system, updating apps, updating operating systems, backing up, transferring data to a new device…), or even anger, which in turn affects coworkers and the quality of the work produced, and leaves a negative impression of technology.
In any science-fiction film, the error messages that appear in the control room of a spaceship instantly carry a sense of dread or imminent death. Should one of the operating systems encounter a bug, or should the last survivor of a crash have to figure out a complex array of commands and knobs, intense, suspenseful music swells, paired with close-ups on beads of sweat. Life is on the line here, maybe even the survival of mankind. Don’t mess it up.
For me, when the same kind of confusion or technical failure happens here on Earth, even with no lives on the line, the drama is just as real. In the virtual world, as in space, no one can hear you scream, and the real stakes are deadlines, efficiency, time spent working out kinks in your workflow, morale, and overall productivity. Time is money. You want to spend 100% of your time working with a smile on your face, making progress on your exciting projects and changing the world, not fixing computers or endlessly tweaking configurations.
The more excuses a computer gives you not to do the work you’re asking it to do, the more you’re actually doing the work for your computer.
And this is where the debate around intelligent technologies stems from: if we forget these simple principles, we can end up in big trouble, in a future where lasers replace error messages and interruptions in your workflow become interruptions of life. Give a “good”, user-centric technology arms and legs, and it will serve everyone and make the world a better place. Give a “bad” technology arms and legs, and it will use them to crush your planet to smithereens.
Until the day when machines actually point their guns at us with the intent to shoot, users will be left with the heavy burden of carefully choosing which tools they promote and use, and which ones they leave behind.
If the definition of “user-friendly” can be subjective, based mostly on a user’s previous experience with computers and their own education, the definition of what isn’t “user-friendly” is easier to pinpoint.
Architect Katerina Kamprani has created a few images to illustrate what everyday objects would look like if usability wasn’t taken into consideration. One of the images she created is this hairy glass:
Which raises the following question: if a tool isn’t user-friendly, then what exactly is it? Is it “user-passive-aggressive”? “User-hateful”? Or simply not “user”-centric (“meant to be looked at, but not used”)?
The furry wine glass above looks like an everyday tool that went back to a “natural” stage, or an early design that was considered, then abandoned. If you feel like you work with tools that are similar to a furry glass, move on. If you’re designing tools that will be used and perceived with as much enthusiasm as a furry glass, you need to think a little bit more about the end user. No matter how pleasing to the eye you believe fur is.
And remember: a wine glass that isn’t made entirely of fur, but made of glass, is just a first step towards usability. Without instructions, some people will still break the glass. Some people will try to use it upside down. Some people won’t know what the hell it is.
Article written with macOS Stickies, the first-ever autosave app.