Updating the Responsible Technology by Design framework

Sarah Gold
Published in Writing by IF
Jun 25, 2024

Our Responsible Technology by Design framework describes the properties of trustworthiness in digital products and services, and how those properties can be changed. We have updated the framework, and I am delighted to share it with you.

First, let’s unpack what we mean by trustworthiness in digital products and services…

Trust is essential, and technology is making it more complex

Trust is at the core of every relationship. But, it’s hard to define, control or measure. It is a feeling that changes over time.

Today, there are more people and organisations to trust than ever before. Many of those organisations are choosing to deliver services using technology, from text messages through to AI. Few of us know how these technologies work. And digital products and services are still inherently opaque.

From trust to trustworthiness

Trustworthiness is the act of consistently demonstrating that you are worth trusting. It is something we can practically change and measure within digital products and services.

And change is needed at the product level to meet the demands of a growing and shifting regulatory landscape. It’s possible to design products and services that embody trustworthiness.

At IF we’ve been working on this challenge for many years, with other leaders in this field. It’s from this work that we have learnt that trustworthy products and services are those that help people feel empowered, safe and respected. Trustworthiness needs to be experienced by those people who come into contact with the product or service.

As we like to say at IF, trust is the new experience.

[Image: TRUSTWORTHY: EMPOWERED, SAFE, RESPECTED]

Three caveats about this work

  1. Trustworthy digital products and services are an emergent phenomenon. As such, these are ideas we hold loosely. Similarly, the definitions and terminology are fuzzy and subject to change based on context, so, as we do, challenge them and adapt them to your situation.
  2. Whilst we are focussed on what makes digital experiences trustworthy, we’re under no illusions: technology isn’t a magical solution to trust. We believe that when deployed thoughtfully, technology has the potential to include vastly more people and create better outcomes at scale. That’s a good thing.
  3. Lastly, and most importantly, the goal of trustworthy products and services is better adoption. Trust makes it easier for organisations to meet people’s needs. Another good thing.

What are the properties of trustworthiness?

Our Responsible Technology by Design framework describes the properties of trustworthiness as transparency, accountability and participation.

[Image: the Responsible Technology by Design framework]

These properties overlap with one another, and can be combined, to create a product or service that is trustworthy. That’s one that helps people feel safe, empowered and respected. Where the service supports them to make informed choices that are right for them, in their context.

How do they manifest in products and services?

The properties manifest in different ways within the layers that make up a product or service. Issues in one layer might only become visible elsewhere, or require resolution in another layer.

We deliberately organise these layers into UX/UI, technology and data, policy, organisation and society. Each of the layers has an impact on the user experience, and on trust.

[Image: layers of a product or service: UX/UI, tech & data, policy, organisation, society]

Transparent

Definition

Make it explicit how technology and data are being used and how decisions are made at every layer of the technology stack — and how this changes over time.

At IF we think that transparency should be meaningful to the people who need it. Practically, this means using transparency as an input for user experiences, where the interface becomes self-revealing. Our research tells us that transparency is most effective when claims are provable.

Examples of how transparency can be made provable:

[Image: a screen showing “Task completed yesterday”, with an arrow leading to another screen saying “Task completed on 11/10/2026 by automated service BestBot” and an option to get digital proof]
Progressive disclosure is a design pattern that enables an interface to be self-revealing without becoming overwhelming or cluttered.
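One way a claim like “task completed by BestBot” can be made provable is to sign the underlying record so anyone holding the proof can later check it has not been altered. This is a minimal sketch, assuming an HMAC-based approach; `SERVICE_KEY`, `make_proof` and `verify_proof` are illustrative names, not part of IF's framework.

```python
import hashlib
import hmac
import json

# Illustrative only: in practice this would be a managed signing key.
SERVICE_KEY = b"demo-secret"

def make_proof(record: dict, key: bytes = SERVICE_KEY) -> str:
    """Sign a canonical serialisation of the record."""
    payload = json.dumps(record, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_proof(record: dict, proof: str, key: bytes = SERVICE_KEY) -> bool:
    """Check the record has not changed since it was signed."""
    return hmac.compare_digest(make_proof(record, key), proof)

# The claim shown in the interface, backed by a digital proof
record = {"task": "payment-check", "completed": "11/10/2026", "by": "BestBot"}
proof = make_proof(record)

assert verify_proof(record, proof)                               # claim holds
assert not verify_proof({**record, "by": "someone-else"}, proof)  # tampering detected
```

A “Get digital proof” button could then hand the user the record and its proof, turning a transparency claim into something independently checkable.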

Accountable

Definition

People are assured that organisations keep the promises they make and that there will be consequences if they don’t.

At IF we think of accountability not only as compliance with legal requirements like consent or governance, but also as the smaller promises that enable people and communities to spot when promises are being broken, and to create change, fix them or get redress. It’s also useful to keep in mind that the perception that something is broken, wrong or being misused can be just as powerful in eroding trust as something actually being broken, wrong or misused.

Examples of how accountability can translate into product:

  • Performance information published publicly
  • Human centred APIs
  • Use digital proofs to provide records of promises
[Image: a button to query a payment]
Query an event is a design pattern that helps someone understand why an event happened and to get help if something doesn’t look right.
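The “query an event” pattern works when every event is stored with enough context to answer “why did this happen?” and to route the person to help. Here is a minimal sketch under that assumption; `Event`, `EVENTS` and `query_event` are illustrative names, not a real IF API.

```python
from dataclasses import dataclass

@dataclass
class Event:
    id: str
    what: str        # what happened, in plain language
    why: str         # the reason, recorded at the time of the event
    help_route: str  # where to go if something doesn't look right

# Illustrative event store: one queryable payment event
EVENTS = {
    "pay-42": Event(
        id="pay-42",
        what="Payment of £30 taken",
        why="Monthly subscription renewal under your current plan",
        help_route="/support/query-payment/pay-42",
    ),
}

def query_event(event_id: str) -> dict:
    """Answer 'why did this happen?' for a single event."""
    e = EVENTS[event_id]
    return {"what": e.what, "why": e.why, "get_help": e.help_route}

answer = query_event("pay-42")
# answer["why"] explains the payment; answer["get_help"] routes to redress
```

The design choice that matters is recording the reason at the moment the event happens, so the explanation is a fact of the record rather than a reconstruction after a complaint.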

Participatory

Definition

Individuals, multiple people and/or communities are included, involved and supported to engage in, debate and influence the equity or outcomes of a product or service.

At IF we think that participation needs to go beyond public participation, involvement and engagement programmes. Whilst necessary, they are insufficient. We need many different kinds of participation that are capable of moving at different paces and scales. For example, people need ongoing participation methods, particularly for more complex services, where their expectations, or the technology, will likely change. Or, when we apply participation to experiences that are enabled by AI, participation can mean co-piloting, steerability and recovery of AI systems.

Examples of participation within product:

  • Counterfactual explanations
  • Equality reporting
  • Feedback features for model improvements
[Image: a dialogue showing an AI system reporting “Unable to complete task automatically”, with an option for the user to take over]
Step and takeover is a design pattern that gives a person the ability to take over a task from an AI system, steering it towards a particular outcome.
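The step-and-takeover pattern can be sketched as a loop that attempts each step automatically and hands control to the person whenever a step cannot complete on its own. This is a minimal sketch, not IF's implementation; `AutomationError`, `run_with_takeover` and the step functions are hypothetical names.

```python
class AutomationError(Exception):
    """Raised when a step cannot be completed automatically."""

def run_with_takeover(steps, human_handler):
    """Attempt each step; on failure, hand that step to the person."""
    results = []
    for step in steps:
        try:
            results.append(("auto", step()))
        except AutomationError as err:
            # The UI would report "Unable to complete task automatically"
            # and offer the takeover; the handler stands in for the person.
            results.append(("human", human_handler(step, err)))
    return results

def fetch_statement():
    # Completes automatically
    return "statement.pdf"

def resolve_ambiguous_payee():
    # Needs a person to steer the outcome
    raise AutomationError("two matching payees found")

outcome = run_with_takeover(
    [fetch_statement, resolve_ambiguous_payee],
    human_handler=lambda step, err: f"resolved by user ({err})",
)
# outcome[0] completed automatically; outcome[1] was taken over by the person
```

The point of the pattern is that the handover is per step, so the person steers only the part the system could not do, rather than restarting the whole task.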

People’s expectations are rising

People have high expectations of organisations, and they expect trustworthy solutions across the private and public sector. For example, we know that many people expect technology to understand them and anticipate their needs, but not at the expense of their privacy. Understanding how best to deliver user value whilst demonstrating trustworthiness is increasingly important.

Translating trust into product creates an advantage

Earlier this month Apple announced “Apple Intelligence”. In their marketing they speak to privacy, but unlike many other companies they also translate this provably into their product and underlying technology. These moves are deliberate and comprehensive. And where Apple leads, the industry usually follows.

The organisations that understand how trust creates a competitive edge have the advantage.

Get in touch

We work with private and public sector clients around the world on some of the most challenging implementations of trustworthiness in products and services, and have done so since 2016. Our speciality is making trust actionable, transforming principles into execution that accelerates adoption.

Do get in touch if you want to understand more about our work and how we can help.


Designing for trust. Founding partner and CEO @projectsbyif