The Trouble with Friction
Friction in user interface and user experience design is a contentious topic. Almost always, designers seek to eliminate it. After all, it makes sense intuitively that you wouldn’t want to prevent a user from doing the things that they intend to do with your software. The more you annoy a user with unnecessary friction, the more they’ll come up with workarounds or alternatives.
But the truth here has a subtlety that is easy to miss: the trouble with friction in interfaces isn’t that it stops someone from doing something; it’s that it stops someone from doing something that they want to do. And therein lies the trouble: determining what someone wants to do on a computer system is an inexact and error-prone process.
A little friction can be a good thing
In physical systems, friction has a way of wearing out parts and causing mechanisms to fail. Otherwise productive energy is lost as heat to the environment. It’s no wonder we use it as a metaphor in computer science and seek to eliminate it. But at the same time, friction is also responsible for the ability to stop and start motion. For things like wheels and pulleys to work, they need friction between certain parts. In other words, friction in physical systems can be useful, but only when it exists as a tool and not as a byproduct.
I’d like to posit that not every action the user can take in an application should be equally easy. Instead of being eliminated, friction in a user experience needs to be carefully controlled. For example, if an action is destructive, especially if it can’t be undone, then it’s generally a very good idea to stop the user before they break everything and make sure they realize what they’re doing.
Are you sure you want to delete the world? Y/N
These pauses aren’t perfect at preventing mistakes, of course, but they do at least give the user an opportunity to correct them. In cases like this, most users will, at some level, appreciate that the system is paying attention to what they’re doing and trying to work with them to prevent disaster.
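A minimal sketch of such a confirmation gate might look like the following. The `delete_everything` function and its prompt wording are hypothetical; the `ask` parameter defaults to `input` but can be swapped out for testing:

```python
def confirm(prompt: str, ask=input) -> bool:
    """Ask a yes/no question; anything other than an explicit yes means no."""
    answer = ask(f"{prompt} Y/N ").strip().lower()
    return answer in ("y", "yes")

def delete_everything(items: list, ask=input) -> bool:
    """An irreversible action, so friction is added deliberately before it runs."""
    if not confirm(f"Are you sure you want to delete all {len(items)} items?", ask=ask):
        return False  # the pause gave the user a chance to back out
    items.clear()
    return True
```

The default answer matters here: treating any ambiguous response as "no" means a mistyped reply fails safe instead of destroying data.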
The real balance comes with figuring out where to add friction and where to remove it. The unfortunate truth is that these bits of friction are not applied very well today. Users are prompted about actions that they thought they were being very clear about, and they’re not prompted about things that they wish someone would have asked about. This can happen for a number of reasons: the application could be misinterpreting the user’s intentions, the application could be paying too little attention to the user’s actions, or the user could be accidentally doing something they don’t intend.
Regardless of the cause, this disparity leads to both user resentment and, even more dangerous, user workarounds. Prompt too often, and users stop paying attention to the warnings, potentially missing an important message. Prompt too little, and users could blithely destroy their hard work without recourse.
Easy to do right, hard to do wrong
Another example of well-balanced friction comes to us from the world of video games. A video game presents us with a systematized and simulated world with which we interact. In a good game, the interaction feels easy — friction is reduced to enhance the engagement and immersion of the player’s experience. However, not everything in the game is easy: in addition to the explicit challenges within the game environment that need to be overcome, there’s also the fact that the interface makes some actions easier than others to perform. A well-tuned game will feel effortless to a skilled player, as long as the player is performing actions that the designer intends. If the player tries to do something outside the intended interaction — walking through a solid wall or defeating an enemy, for example — then the game is going to push back and make that action difficult to accomplish. When the game pushes back, the feedback lets the player decide whether to adjust their action and desired outcome in response. Sometimes the player really does want to do what the game thought they were doing, but other times they’re attempting to do something completely separate from what the game is preventing.
Other user experiences provide a similar feedback loop. Users engage with the system and figure out how to accomplish the things they want to do with the system at hand. Common, positive actions need to be as frictionless as possible, but uncommon and potentially negative actions need not follow suit. By examining the context and usage of an application, developers can do a better job of predicting how users will map their intentions to the system’s functionality.
Asking for forgiveness
In some fields it’s easier overall to clean up after a user’s mistake than it is to prevent that mistake. Commerce-driven outfits like Amazon and eBay get out of the way of you giving them money as much as possible, especially if you’ve proved that you can give them money before.
Though this approach is cited as a great model for reducing friction, it can’t be universal. While you can refund a purchase, you can’t refund a data breach. If someone orders a product by mistake, it can be returned and the money refunded. If someone’s medical records are posted online, the damage to their privacy cannot be undone.
Once again, friction in a system can be used as a tool to slow things down at appropriate moments. We can use gray lists to engage users, disclosing not only what we think they’re doing but also what the consequences of continuing would be for them. This means that we can still allow actions that could be damaging — such as releasing medical records to a third-party application — without making such a release so trivial as to be dangerous.
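One way the gray-list idea might be read as code: a lookup from sensitive actions to a plain-language consequence, shown to the user before the action is allowed through. The action names, messages, and `perform` function here are invented for illustration:

```python
# Hypothetical gray list: actions that are allowed, but only after the
# system discloses what it thinks the user is doing and what will follow.
GRAY_LIST = {
    "share_medical_records": (
        "A third-party application will receive your medical records. "
        "This cannot be undone."
    ),
}

def perform(action: str, confirm, execute) -> bool:
    """Run an action, inserting deliberate friction if it is gray-listed."""
    consequence = GRAY_LIST.get(action)
    if consequence is not None:
        # Disclose the consequence, then ask before continuing.
        if not confirm(f"{consequence} Continue?"):
            return False
    execute(action)
    return True
```

Actions not on the list stay frictionless; the pause is reserved for the moments where the stakes justify it.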
The right incentives
We should be building security into systems in such a way that the system can successfully interpret a user’s actions and determine their intention. Our systems need to get smarter about when they apply friction, using it as a tool instead of as a byproduct of the system. All too often, this friction is misapplied, and as a consequence it’s easy to be a criminal but hard to be a legitimate user.