Ethics and app design
When is designing addictive apps inhumane? Is it ruining our ability to make nuanced decisions?
Friends of mine found an interesting article about how modern technology hijacks our minds to make us addicted to apps. They asked me what a designer's perspective on it was. It was fascinating to see their reactions to things that any UX designer knows: that designers deliberately try to create addictive apps using, amongst other things, Nir Eyal's 'Hook Model'.
Designing addictive apps
I came across the Hook Model during my UX design course at General Assembly and remember being shocked at the power and deliberateness of designing for addiction. I'll admit that, as I was designing an app with a low percentage of returning customers at the time, I was keen to learn all I could about how to make it just a touch more addictive. Retention was what we needed, and employing some of the techniques popularised by Nir Eyal might be the answer.
I'm happy to say we did in fact abandon that product and developed something with far more user input: the aim of good UX design being to help people achieve their aims, not to make money by sucking them into an unproductive time vortex of superficial 'rewards'.
I can understand the general public being uncomfortable with how 'easily' they can be manipulated by a designer who deliberately learns specific elements of their psychology in order to trick them into spending unplanned and unwanted hours trawling through YouTube videos.
'Easily' is in inverted commas because, as someone who's actually tried to design an app with sufficient retention to make it a viable business proposition, I can tell you it ain't easy! It has, however, been done, by the right-place-right-time individuals who, for example, created the Facebook 'like' button (Justin Rosenstein) or the pull-to-refresh gesture (Loren Brichter). And their success is of course being built upon by designers fully aware of the Hook Model, working to increase the amount of time you interact with Facebook, YouTube, Instagram and Twitter, because that's their revenue model.
One of the strengths of the article was that it drew a strong link between those who originally designed what turned out to be the most addictive elements of the apps we all use, and those now blowing the whistle on their own industry and calling bullshit. These are the people now limiting the hours the internet is available in their house, greyscaling their phones to make the apps less enticing, and deleting apps from their home screen so that using them requires typing and searching for them.
Take that, designers: do you think you've gone too far yet? We actually have to subvert your design decisions to be able to use technology in a humane way. (I am in no way blaming you; I no doubt would have been honoured to do the same in your place. I'm just pointing out that, now the insidious negatives are being so clearly seen, there might be ways you could use your prodigious skills elsewhere to benefit this wonderful world :-) ).
The Center for Humane Technology, created by tech insiders, has a simple list of things, including those above, that you can do to limit the addictive qualities of modern apps. It also calls for political pressure to be applied to tech companies and designers to design more compassionately for human vulnerability.
End of App Mania
It made me reflect for the first time on the end of app mania. The tech startup world as I've experienced it is still basically focused on making apps. Now, clearly there is still some mileage in creating innovative apps. Many tech experiences are far from the seamless, pain-free, even joyful experiences they could be, so clearly there is work to be done.
However, perhaps the hubris of attempting to solve the problem of human connectivity with an app is over. The single-minded focus on solving a problem with an app, without considering all the knock-on effects, is finished. When the results are so contrary to the professed aim, it's time to start wondering if you're using the right tool.
One of the things that came across clearly in the article is that we need to treat app addiction in a similar way to gambling or an addiction to high-calorie food. It's dangerous, insidious and causing serious societal challenges.
Designers love new stuff. Just like our fellow designers who thought plastics would solve all the world's environmental problems when they were first invented, we jump on new technology with an exuberant enthusiasm that, in retrospect, often seems misplaced. Now is the time to step back, recognise the negative implications and think about the whole system and context we're designing in.
Want another analogy? Everyone got addicted to nicotine when it was new, and we've all got addicted to apps in the 2010s. It's time we realised what a health and wellbeing travesty it is and quit.
The counter argument
There is without doubt a case to be made for learning specific aspects of human psychology and applying them to help people get the benefit out of a product. For example, looking through the Helix Design Centre's projects, I was struck by their work to improve participation in bowel cancer home screening tests.
Making that process a little more 'addictive', or at least decreasing the activation energy so that people actually do it, seems an eminently valid use of persuasive design: building a positive feeling into the process so that bowel cancer can be treated effectively, and using what we've learnt from highly successful human motivation and psychology projects to improve vital services.
However, the problem is that with all the big addictive apps of the day, the ad revenue model distorts the design into something wholly more sinister. The Facebook like isn't inherently a bad thing in the way that nicotine is; it's the workings that go on behind it, harvesting your data and selling your attention, that are.
If you've never made a Facebook ad, I'd encourage you to try it out (but not publish it!). I've often thought that if people knew how much data Facebook scrapes from you and how accurately you can target adverts, they would realise they're not Facebook's customers; they're the product.
This article, and others like it, hopefully signal the end of the era of having our data harvested more or less without our consent to generate trillions of pounds. That model only leads to companies whose products have the sole aim of capturing our attention. Literally capturing it, like a poacher, and selling it to the highest bidder.
Political click bait
Perhaps the most important connection the article made was between political sensationalism and attention-grabbing phone apps. Looking at some of the big, unexpected political moves of recent years (Trump, Brexit, Corbyn and Sanders), it's easy to see how clickbait, five-second-attention-span articles spawn political sensationalism. The sensational and simplified is all people have time for while scrolling through a Twitter newsfeed. It has resulted in events that certainly succeed in grabbing our attention but don't seem to be moving us in a direction that honestly improves anyone's lives.
Which makes total sense when you consider the conditions we need to make informed choices on complex issues. We need time and space to connect with ourselves and let the nuances play around in our heads until an intuitive sense of what is important to us and those around us arises. We need time alone. We need time to talk things over and get feedback from friends.
What we don't need is a distraction every five minutes. People with young children often comment that it's impossible to get anything done because you're interrupted so often you can't get stuck into a task. It strikes me that constantly messaging, or being sucked into a Facebook vortex, has a similar effect; you just don't get to have children at the end.
Our love of social media reminds me of what Mr. Weasley said to Ginny in ‘Harry Potter and the Chamber of Secrets’:
“Never trust anything that can think for itself, if you can’t see where it keeps its brain.”
For those who haven't read it, Ginny finds an old diary and begins writing all her hopes and fears in it, not realising that it actually contains part of the soul of the darkest wizard of all time, Voldemort. He listens and sympathises with her, before starting to possess her and making her, for example, slaughter chickens and paint blood on the walls.
How like Ginny are we? Confiding our innermost thoughts to a seemingly harmless screen, unaware that a megalomaniac company at the other end is using that data to manipulate us? Do we, too, have long periods of our lives when we're not fully in control of what we're doing, waking up with a vague sense of unease after binge-watching Netflix?
Animism in app design
I found a fascinating, new and relevant way of looking at this problem while listening to 'Magic and the Machine' by David Abram. He makes the point that our deep need to find connection in the apps we use, a major underlying motivation behind all social media, stems from our intrinsic sense that inanimate objects are animate and have something to tell us.
When we lived deeply within nature, the focus and understanding that come from listening to the forest brought a survival benefit. Stripped out of that context, we find excitement and possibility in the aliveness of our notifications and the voice of Siri, but we will ultimately be disappointed, as we are merely connecting within the realm of human experience, not with the deep wisdom of otherness found in rocks and rivers.
If this all sounds totally crazy, I'd really recommend taking the time to listen to it in his own words before designing your next voice UI.