Why trust is more important than good UX

Sarah Gold
Published in Writing by IF
5 min read · May 31, 2023

We all want trustworthy relationships — we need them too. We need to be able to trust the people around us in order to work, learn, play and live well. We make trust decisions every day about which relationships to nurture or avoid. The same applies to services, brands and organisations.

We need an urgent gear-change now

Organisations are using ever more advanced, data-intensive technologies to provide their services. The problem is that the pace of technology deployment is not matched by the design or implementation of the things that enable trust; the gap between trust and technologies like ChatGPT is widening, fast.

The cost of untrustworthy products? Fines, forfeited long-term customer loyalty, lost growth opportunities and weaker talent retention… to name a few

Trust gaps directly shape people’s decisions about which relationships they’ll start, keep or avoid. They also make it more expensive for brands to serve their existing customer base.

Brands that choose not to work on their trustworthiness are also facing higher regulatory fines. That puts services, brands and organisations at risk of damage from their trust deficit. And it’s costing their staff too: 34% of business leaders report feeling morally uncomfortable with how their organisation uses data at least weekly.

Trust is earnt (and lost) across all interactions with your brand, services, or organisation

Services, brands and organisations earn and maintain trust with their customers and partners by demonstrating their trustworthiness over time. That’s why addressing trust is urgent and important work: just as in our personal relationships, there are no silver bullets for quickly earning or repairing trust.

Trustworthy interactions place meaningful control back into the hands of users

Closing the trust gap creates opportunities to meet new, emerging user needs for trust. Helping users know whether they are communicating with a chatbot or a person, for example. That’s important! Today, a growing number of ethically minded consumers believe the customer relationship extends beyond transactions. Helping users helps you acquire new customers, and maximise and retain the value of each customer over time.

We have started to define specific enablers of trustworthy data-driven services—so that others can apply and shape the thinking.

Annotated paper wireframe of a hospital discharge screen, showing a patient how they can read their discharge letter and access the data used while they were in hospital. It presents a verifiable data history of what happened to the patient, putting transparency at the point of need for the user.
Low-fidelity prototype of a mental health chatbot, Moodjar, created in collaboration with COMUZI. The chatbot asks the user whether it has correctly understood their feelings, empowering them to course-correct the AI in context. Designing change-enabling feedback loops between technology and users is a key component of meaningful user control.
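The feedback loop described above can be sketched in code. This is a minimal illustration only, assuming a toy mood classifier; the class, method names and the trivial keyword-matching "model" are all invented for this sketch and are not IF's or COMUZI's implementation.

```python
class MoodChat:
    """Toy chatbot that states its interpretation and lets the user correct it."""

    def __init__(self):
        self.corrections = {}  # user phrase -> corrected mood label

    def classify(self, message: str) -> str:
        # Trivial stand-in for a real model; honours past corrections first.
        if message in self.corrections:
            return self.corrections[message]
        return "anxious" if "worried" in message.lower() else "neutral"

    def respond(self, message: str) -> str:
        mood = self.classify(message)
        # Surface the interpretation so the user can course-correct it.
        return f"It sounds like you're feeling {mood}. Did I get that right?"

    def correct(self, message: str, actual_mood: str) -> None:
        # The correction changes future behaviour: the loop is closed.
        self.corrections[message] = actual_mood


bot = MoodChat()
print(bot.respond("I'm worried about work"))  # interpreted as "anxious"
bot.correct("I'm worried about work", "stressed")
print(bot.respond("I'm worried about work"))  # now reflects the correction
```

The design point is the last step: the user's correction visibly changes what the system does next, which is what makes the control meaningful rather than cosmetic.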

Today there is a real opportunity to influence lawmakers and define the future of policy so that it works for users. For example, generative AI content creates multiple new trust risks for users — and therefore businesses — that policy is already starting to address. GenAI makes it harder for customers to know what content to trust. But customers’ expectations of trustworthy brands are high, and they expect responsible solutions. Techniques like watermarking and verification are possible, but technical complexity and some users’ motivations mean 100% accuracy is not achievable. Instead, it takes an understanding of technical feasibility and user needs, and of how they influence each other over time, to create meaningful solutions. Getting this right early means you’ll see around corners that others can’t.
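To make the verification idea concrete, here is a minimal sketch of content provenance checking using a keyed signature. Everything here is illustrative: real schemes (such as C2PA-style content credentials) use public-key signatures and signed metadata rather than a shared secret, and statistical watermarking of generated text is a separate, harder problem.

```python
import hmac
import hashlib

SIGNING_KEY = b"demo-key"  # illustrative only; never hard-code real keys


def sign_content(content: str) -> str:
    """Generator attaches a signature when it produces content."""
    return hmac.new(SIGNING_KEY, content.encode(), hashlib.sha256).hexdigest()


def verify_content(content: str, signature: str) -> bool:
    """Anyone holding the key can check the content wasn't altered."""
    expected = sign_content(content)
    return hmac.compare_digest(expected, signature)


article = "AI-generated summary of today's weather."
tag = sign_content(article)

print(verify_content(article, tag))        # True: content unchanged
print(verify_content(article + "!", tag))  # False: content was altered
```

Note the limitation this sketch shares with real systems: a signature proves content wasn’t altered after signing, but it cannot label content that was never signed in the first place — one reason 100% accuracy is out of reach.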

Counterfactual explanations for automated insurance quotes provide in-context transparency around how prices were calculated, without overwhelming the user.
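A counterfactual explanation of this kind can be sketched simply. The pricing rule below is invented for illustration — a real insurer’s model is far more complex — but the explanation pattern is the same: find a small change to the inputs that changes the price, and state it in plain language at the point of need.

```python
def quote(age: int, annual_mileage: int) -> int:
    """Toy pricing rule, invented for illustration."""
    price = 500
    if age < 25:
        price += 300  # young-driver loading
    if annual_mileage > 10000:
        price += 150  # high-mileage loading
    return price


def counterfactual(age: int, annual_mileage: int) -> str:
    """Explain the quote via the nearest input change that would lower it."""
    current = quote(age, annual_mileage)
    if annual_mileage > 10000:
        lower = quote(age, 10000)
        saving = current - lower
        return (f"Your quote is {current}. If you drove under 10,000 miles "
                f"a year, it would be {lower} (a saving of {saving}).")
    return f"Your quote is {current}. Driving fewer miles would not reduce it."


print(counterfactual(30, 12000))
```

The user never sees the model’s internals; they see one actionable “what if”, which is what keeps the transparency in-context without overwhelming them.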

Trustworthy organisations are better positioned to innovate at pace

In 2019 we worked with a healthcare insurance company looking to use a new disease-predicting AI model to provide more proactive care to a wider range of people. But the trust and ethical implications of using this technology were complex. We helped them develop a framework, process, risk register and Ethics Council — before their competitors. This laid the foundations for them to innovate responsibly, changing their cost model so that it was easier for them to do the right thing and meet growth targets. Solving these challenges early saved them $50M in their first year, and helped them to roll out new models faster and more responsibly than competitors.

Trustworthy brands are better positioned to grow

Organisations that have been addressing their trust gap are able to operate in new markets, and can more easily diversify their portfolio. Apple is a great example of this. From sustained marketing campaigns (“Privacy. That’s iPhone”), to rebuilding Maps in 2018 to be more trustworthy, to its commitment to encryption, Apple has consistently invested in its customers’ trust. And when it launched a savings account, over $1B was deposited in its first 4 days. Trust underpinned this.

At IF, we specialise in designing for trust across the stack

We have been working at altitude on complex trust challenges, helping clients from Big Tech to governments make trust actionable. Because we know that trust takes more than UI changes, we identify high-leverage points across policy, organisational capabilities, technical architecture and user experience — what we call the ‘full stack’. We then work on each of those components, which together move an organisation from zero to trusted. For example, last year we helped an engineering team create backend services that are used to reduce major data breaches worldwide.

This includes providing a vital bridge between policy and product

Regulation is impacting product teams more than ever. We have extensive experience in the policy space and have helped teams anticipate and design for incoming requirements for 7+ years. As a trusted third party, we have also found that our well-respected reputation in these spheres facilitates constructive conversations. For example, when we worked with an AI organisation that had undergone a trust breach, we liaised with the regulator and civil society organisations to address their concerns and evidence change. This gave external parties confidence that trust was being repaired internally.

What we hear from clients is that IF’s brand has equity. The work we share has weight because of what we stand for and our reputation in this space. Work we delivered on trustworthiness with generative AI in February continues to be used and built on internally, in part because it was made by us.

Trust isn’t a new problem, but we are living in an era where trust will collapse unless we act. Trust takes years to earn, seconds to break, and forever to repair.

Acknowledgements: This blog post has been co-written and edited with Imogen Meborn-Hubbard, with the help of the team at IF.

Ready to act? Know someone else we should be speaking to? Get in touch at hello@projectsbyif.com.
