Holding responsibility

Dan Sutch
Published in CAST Writers
May 30, 2024

One of the key parts of our experimentation canvas is considering what’s been described as ‘person in the loop’ — the role of a human when using GenAI.

We’ve experimented with creating our own CAST Sidekick, a ‘custom GPT’: a version of ChatGPT grounded in published CAST materials so that (in principle) anyone can (almost) have a member of the CAST team sat alongside them when exploring the role of digital, data and design in their work. It’s been designed to act as a supportive coach, drawing on the sorts of work and experience that CAST has developed over the past nine years. The ‘person in the loop’ starts with the CAST team designing the Sidekick together, discussing the ways in which we work and look to provide support to charities. The second role is testing Sidekick’s responses until we’re confident it represents our experience and views.

Once we’ve completed these tests, there are still a number of ‘person in the loop’ roles that we might use:

  • only using Sidekick to inform ourselves when we’re providing direct support to charities;
  • sharing Sidekick with the people we support, but still regularly checking in with them to understand the interactions they’ve had, discussing them and course-correcting or reflecting on that experience;
  • sharing Sidekick with those we work with, but checking its responses to ensure they are accurate: a more hidden role, but one that ensures anyone interacting with Sidekick has an accurate and positive experience (a sketch of this follows below).
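
To make that last role concrete, here is a minimal, purely illustrative sketch in Python of what ‘checking the responses’ can look like in practice. The ask_sidekick function is a hypothetical stand-in (not a real Sidekick API) for whatever call returns a draft answer; the point is simply that every draft passes a human reviewer before it reaches the person we’re supporting.

```python
def ask_sidekick(question: str) -> str:
    # Hypothetical placeholder: in practice this would query the
    # Sidekick custom GPT and return its draft answer.
    return f"Draft answer to: {question}"


def reviewed_answer(question: str) -> str:
    """Route every Sidekick draft through a human reviewer before release."""
    draft = ask_sidekick(question)
    print(f"Question: {question}\nDraft: {draft}")
    verdict = input("Approve this response? (y/n) ").strip().lower()
    if verdict == "y":
        return draft
    # The reviewer stays accountable: a rejected draft is replaced,
    # not silently passed along to the person being supported.
    return input("Type a corrected response: ")


if __name__ == "__main__":
    print(reviewed_answer("How do we start a digital project?"))
```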

The reason for defining the ‘person in the loop’ is that we remain accountable and responsible for interactions with Sidekick. If we do get to a point where Sidekick is openly accessible to anyone wanting to develop their charity’s digital work, we will still be responsible for each person’s engagement and learning, even though we won’t necessarily know who is engaging with it.

As we develop digital tools, it’s quite easy to unintentionally outsource responsibility for important decisions or digital interactions, often to a digital partner, without understanding how decisions made within the development process influence the end user’s experience. With GenAI it could be just as easy to outsource responsibility for the information or experience of your ‘users’ to those creating GenAI tools. But within human-to-human services we’d work hard to ensure new staff could represent the organisation’s approaches, experience and insight through onboarding, professional development and support, because the organisation (based on the governance model employed) holds the accountability for interactions with that person. This accountability remains when we use GenAI: the accountability and responsibility remain with us.

Equally, we need to understand these tools well enough to hold those creating GenAI to account for their impact on the communities we support. That means intentionally looking for those implications and considering the role of our organisations in holding new power to account.

[Cartoon: a boy blasting a computer with a phaser gun, with the words ‘You vectorised something you shouldn’t have done, so I’m going to vapourise you with my fazer gun.’]

We’ve put together a living library of AI resources that we hope will support you as you explore the challenges and opportunities associated with AI. Please take a look — and do let us know if there is anything you need particular support with.

[AI transparency statement: the cartoon was generated using ChatGPT 4o; spell check and grammar check were used in this document.]
