Humane technologies, not mechanical monsters: designing apps that serve people

Christopher Cocchiarella
Published in Vanguard UX
6 min read · Nov 2, 2022
Person using a mobile app
As UX professionals, we strive to create humane technologies in the best interest of end users. That means asking ethical questions about how apps collect and use people’s information to personalize the experience.

I’ve always believed that user experience (UX) and tech ethics are inseparable. If anything, the former requires the latter. You can’t create user-friendly designs with useful, usable content unless they’re truly built in the best interest of end users. So, when I joined Vanguard’s Client Experience and Digital (CXD) department this past year, I was delighted to be surrounded by a culture of UX professionals who share this conviction.

In addition to learning from many talented researchers and designers, this work experience has given me an opportunity to think more deeply about the intersection of UX and tech ethics. To illustrate, I’d like to share some thoughts on this topic, using mobile app design as an example. But before we get into mobile app design, let’s talk about monsters.

From mechanical monsters to humane technologies

In old myths, monsters took the form of deadly dragons or beasts. But in today’s stories, they often appear as machine-like menaces. Perhaps the most famous is Darth Vader, who, to quote Obi-Wan Kenobi, is “more machine now than man.” As popular sci-fi metaphors, mechanical monsters symbolize what happens when we lose control over technology. Instead of machines that serve people, we wind up with the opposite: people who become cogs in a machine.

Darth Vader
Photo by TOMMY VAN KESSEL on Unsplash

As UX professionals, we try to prevent that from happening by thinking about technology from the standpoint of human-centered design. In this way, we strive to create humane technologies by designing applications in the best interest of the people they’re meant to serve.

In mobile app design, for instance, we can ask ethical questions about how an application collects and uses people's personal information. These questions apply to web design as well, but in this article, we'll focus on app design:

  • What personal information is the app collecting from users, and how is it using their private data to personalize the experience of the app?
  • Is that personalized experience truly designed in the best interest of those users?

Let’s address these questions by looking at two common, interrelated features in mobile app design: privacy and security features, and habit-forming features.

Privacy and security features

To function, apps may need to access personal information. For example, Vanguard’s mobile app is a financial application that allows people to invest their money and save for retirement and other goals. The app wouldn’t work without access to the user’s name, accounts, and other financial data.

Since the data may be confidential or personally identifiable information (PII), users need to be able to trust the app to protect their privacy and security:

  • Privacy: The app should respect people’s privacy by only collecting data relevant to the task or goal at hand. In the prior example, the Vanguard app collects data needed to provide its financial products and services.
  • Security: For any data the app does collect, it must include security measures to safeguard user information. Again, in the prior example, the Vanguard app employs security measures (including codes, keys, and alerts) to keep this information safe. Indeed, at Vanguard’s CXD department, security is the first “experience pillar” of mobile app design.
Security feature on mobile app
Security is the first “experience pillar” considered in mobile app design at Vanguard’s CXD.

However, it’s no secret that some apps out there may harvest private data for outside parties, possibly without clear consent from users. There’s a growing awareness of this problem on social media apps that collect our private data, offer outside parties access to this personal information, and target us with invasive ads or content.

This practice can put our privacy and security at risk, and it has spurred governments to pass new data protection laws, such as the European Union's General Data Protection Regulation (GDPR). At the same time, users' expectations around privacy and security are rising. Users should be able to trust apps to safeguard their private data and keep their identities secure.

From privacy and security to personalized experiences and habits

Users should also be able to easily understand what applications do with their personal information. That’s because the way an app uses their data — a.k.a. personalization — affects how they use the app.

This brings us to another feature in app design: personalizing the app experience in ways that lead users to form certain habits. The question is, will these habits be favorable nudges for end users, or adverse addictions?

Favorable vs. adverse habit-forming features

Many app features use personal information to help users form favorable habits, such as regularly performing tasks linked to their goals. To return to the previous example, the Vanguard app presents users with a personalized experience of their financial information.

Even so, the point of this personalized experience isn’t just to enable users to invest their money. It’s also to help them invest their money wisely. It’s to help give investors, as Vanguard puts it, “the best chance for investment success.” Hence, personalizing the experience of an app can help users turn simple tasks, such as investing, into favorable habits linked to clear goals, like saving for retirement.

The Vanguard app creates a personalized experience to help give investors the best chance for investment success.

Some applications, however, promote habit-forming features that aren’t favorable to users. For instance, social media apps may contain addictive features like infinite scrolling — designed to keep users on the interface nonstop. By harvesting data about any and every aspect of their lives, this feature personalizes the app experience by targeting users with whatever ads or content will keep them scrolling … and scrolling … and scrolling …

This addictive feature can easily become adverse for users — there’s a reason we call it ‘doomscrolling’! At the very least, this type of personalized experience certainly doesn’t help anyone make progress on tasks linked to goals — especially goals in the best interest of end users.

Person addicted to scrolling on mobile phone
Photo by Pratik Gupta on Unsplash

While personalized experiences can help users form habits, those habits ought to be favorable nudges, not adverse addictions. In other words, apps should enable personalized experiences that help users form favorable habits, such as completing simple tasks linked to clear goals.

Apps that build trust and serve people

As UX professionals, we want to ensure our designs serve people. If our apps collect and use personal information, then we should make sure they follow some ethical rules of thumb. In this article, I’ve argued for a couple:

  • Apps must safeguard personal information to protect users' privacy and security, and they should collect only the information relevant to creating a personalized experience that helps people complete tasks.
  • The tasks that apps help users complete ought to align with (or at least not run contrary to) goals that are in the best interest of people using the app (such as enabling favorable habits, not adverse addictions).

Designing applications in this way is good for business, because it builds trust with our users. It’s also the right thing to do for the people our apps are designed to serve. Moreover, it helps ensure we’re designing humane technologies, as opposed to mechanical monsters.

Interested in joining the conversation or learning more about UX opportunities at Vanguard? Check out Vanguard careers in UX on our site.

*Please note that all investing is subject to risk, including the possible loss of money you invest.

Christopher Cocchiarella is a UX Content Strategist at Vanguard, working to create humane tech through meaningful content and ethical design.