AI in action (treat with care)

David Scurr
Published in CAST Writers
7 min read · Apr 18, 2024
David from CAST presenting at a workshop and pointing at a slide about ‘AI in the social sector’.

I recently took a week off, and in line with my New Year’s best intentions, I made sure to find some time to ‘play’. What began as a jam session on my drum kit ended with me forming a new AI-powered band and experimenting with AI music generators like Udio and Suno (mind blown again). It was pretty harmless fun (check out this CAST 90s throwback dance anthem!) and a classic example of ‘Bring Your Own (BYO) AI’ at home. It’s not an approach you’d want in your workplace, yet we’re seeing many instances of ‘shadow AI’ or ‘BYO AI’ at work, and it can carry significant risks. As a recent workshop participant put it: “it’s like the wild west in my organisation”!

No AI policy, no play?

We know from our recent AI survey that 73% of organisations have used or plan to use AI in their operations. This indicates a growing recognition of the benefits that AI can bring. However, despite this enthusiasm, there is a noticeable lack of support at the organisational level. Only 6% of respondents stated that their organisation had an AI policy in place, and over half reported receiving no AI-related training or support.

Infographic summarising the ‘lows’ of the CAST AI survey results: a lack of policies, research, training and support.

If this is the state of play, how can we instil confidence in charity staff to experiment with AI in a responsible and safe way, without them having to resort to BYO AI?

This is a question we explored at a recent Charity Digital workshop with over thirty charity digital leads. Aside from being very excited to facilitate again in person, I was struck by how anxious and apprehensive some of the participants were towards AI.

This is partly due to the rapid adoption of AI technologies such as ChatGPT, which reached a user base of 100 million within just three months, and partly due to AI’s transformative potential. The combination of speed, impact and uncertainty can create overwhelm in teams. The gap between that pace and the support on offer can also lead to individuals taking their own initiative through ‘shadow AI’ or ‘BYO AI’, which brings its own risks.

Two things we aimed to do in the workshop:

  • Demystify AI by showcasing simple use cases of AI in action: live demos of applications that would resonate with everyone
  • Empower organisations to experiment with AI safely, by facilitating discussions and open sharing, and by passing on tips and resources we’re finding helpful

AI in action: Three simple applications of AI creating day-to-day efficiencies

The power of AI lies in its ability to automate repetitive tasks, saving valuable time and resources. As tech pundit Benedict Evans put it, it’s akin to having an infinite pool of interns at your disposal, ready to take on mundane tasks and free up staff to focus on more strategic or creative aspects of their work.

Why create efficiencies, you might ask? There are already some good examples of cost-saving efficiencies emerging in the third sector. At CAST, we’ve all been tasked with finding 20% efficiencies in our work, and, as we’ve shared with colleagues, the aim is to free up time for our friends, families and communities. How might we reduce our input by 20% while delivering just as much quality support? AI automation no doubt presents opportunities to meet this goal.

So let’s delve deeper into some simple and practical applications we’ve experienced at CAST.

1. Content and research synthesis

AI can be extremely valuable when it comes to synthesising content. For example, after an online workshop, there’s often a need to review and combine notes and post-its to distill key points. At CAST, we use Miro a lot as a whiteboarding tool, and its new built-in AI feature, Miro Assist, can help with this by processing large amounts of content and extracting key themes and insights.

Things to look out for: it’s important to test the tool against the human-only process and share the results with participants for their perspectives. Our team compared the human-only notes with Miro Assist’s output, and both methods delivered really good results!
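If your whiteboarding tool doesn’t have a built-in assistant, the same idea can be sketched with a general-purpose LLM API. Below is a minimal illustration, not our Miro Assist setup: it assumes the OpenAI Python client and uses a made-up list of post-it notes.

    from openai import OpenAI

    client = OpenAI()  # assumes an OPENAI_API_KEY environment variable

    # Hypothetical post-it notes exported after a workshop
    notes = [
        "More training on prompt writing, please",
        "Worried about data protection with free AI tools",
        "Meeting transcription saved me hours last week",
    ]

    # Ask the model to group the notes into themes and summarise each one
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Group these workshop notes into key themes and "
                        "summarise each theme in one sentence."},
            {"role": "user", "content": "\n".join(notes)},
        ],
    )

    print(response.choices[0].message.content)

As with Miro Assist, treat the output as a first pass and compare it with a human-only synthesis before sharing it.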

2. Accessing information so staff can respond more quickly to queries

What if every charity and CAST partner had a member of the CAST team available 24/7 to coach them? We’ve been experimenting with Sidekick, a custom GPT built on ChatGPT, to explore how we might do just that. The CAST team currently uses it to respond more quickly to incoming queries from programme participants and to act as a coach. We think that by deploying AI-powered chatbots, we can enhance the efficiency of our support services and give users instant access to information and assistance. That’s what we’re testing internally.

Check out Dan’s demo of Sidekick from the workshop below.

Things to look out for:

  • We’ve trained the custom GPT so that answers are pulled from trusted, published data as a priority, e.g. CAST toolkits and blogs, and Catalyst resources (see the sketch after this list).
  • We always compare AI-generated responses with human-generated ones to verify accuracy and relevance.
  • All content inputted is anonymised: we don’t use personal data and we’ll only use it internally.
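For the curious, a pattern like this can be approximated by retrieving trusted excerpts and asking the model to prioritise them. The snippet below is a rough sketch, not Sidekick’s actual implementation: the document store, its contents and the keyword lookup are hypothetical stand-ins for a proper search index.

    from openai import OpenAI

    client = OpenAI()

    # Hypothetical store of trusted, published excerpts (e.g. toolkit pages)
    TRUSTED_DOCS = {
        "discovery": "A discovery phase explores user needs before building anything.",
        "ai policy": "An AI policy should set boundaries on data, tools and review.",
    }

    def answer_query(question: str) -> str:
        # Naive keyword retrieval; a real system would use embeddings or search
        context = "\n\n".join(
            text for key, text in TRUSTED_DOCS.items() if key in question.lower()
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system",
                 "content": "Answer using the trusted excerpts below as a priority. "
                            "If they don't cover the question, say so.\n\n" + context},
                {"role": "user", "content": question},
            ],
        )
        return response.choices[0].message.content

    print(answer_query("How should we run a discovery phase?"))

Note how the comparison step from the list above still applies: a human checks the answer against what a team member would have said.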

3. Transcribing meetings

(By the way, this was the use case workshop participants were most excited about!)

AI can be instrumental in saving significant time in meetings by taking over the task of note-taking. Tools like Fathom, Zoom AI Companion and Otter.ai (there are lots more) can transcribe meetings in real time, send AI summaries and suggest actions, freeing staff from taking notes and sending follow-ups. In a meeting-heavy sector, this simple application can save significant time, allowing staff to focus more on facilitation and maybe even finish their day on time! It’s not perfect, but as a participant reflected in the workshop, neither is note-taking done by a human.

Things to look out for: if you’re testing a new transcription tool, it’s crucial to check its data policy and ensure it complies with your organisation’s policies, just as you would for any other digital tool you adopt (this is not BYO AI, remember!). NCVO have some good guidelines for adopting new digital tools if in doubt. Additionally, it’s necessary to inform the meeting participants that an AI assistant is in use, to maintain transparency and uphold ethical standards. Finally, review the notes and summary, and approve or tweak the suggested actions (you’re still in charge!).
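To make the pipeline concrete, here’s a generic two-step sketch: transcribe a recording, then summarise it and suggest actions. It isn’t how Fathom, Zoom AI Companion or Otter.ai work internally, just an illustration using OpenAI’s Whisper and chat APIs; the file name is hypothetical, and it assumes participants have already been told a transcript is being made.

    from openai import OpenAI

    client = OpenAI()

    # Hypothetical recording; participants were informed and consented
    with open("team_meeting.mp3", "rb") as audio:
        transcript = client.audio.transcriptions.create(
            model="whisper-1", file=audio
        )

    # Summarise the transcript and draft (not decide!) the follow-up actions
    summary = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Summarise this meeting transcript and list suggested "
                        "actions. Flag anything needing a human decision."},
            {"role": "user", "content": transcript.text},
        ],
    )

    print(summary.choices[0].message.content)

The last step above still stands: a human reviews the summary and approves or tweaks the actions before anything is sent.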

Treat AI with care

While AI offers many opportunities, it’s crucial to approach it with care.

First of all, we encourage anyone experimenting with AI to think about what level they’re exploring: efficiencies within existing workflows, new processes and ways of working, changes to your mission, or context change. You might be worried about the longer-term structural changes AI will bring to young people’s routes into employment, while also being excited about how your team can use AI to capture notes and generate insights to inform new work. All are very valid; think about which level you want to explore. The illustration below helps us identify which area we’re focused on:

A ‘work in progress’ CAST AI framework outlining four areas of focus (from top to bottom): context change; changes to mission; new processes and ways of working; efficiencies within existing workflows.

Also consider your organisation’s current pain points, processes that need improving and potential quick wins. When exploring solutions, we need to ensure we’re following good design principles and staying focused on user needs (e.g. staff, volunteers, service users).

We’ve also been testing an AI Experiment Canvas to bring some consistency to our internal experiments and ensure we’re applying the same set of principles across the board. The canvas sets out the hypothesis we’re testing, how we’ll measure its success, which tools we’ll be using, and how we’re mitigating risks and setting boundaries.

Below is a work-in-progress example of the canvas (we’ll share a more user-friendly version of the AI Experiment Canvas soon):

CAST AI Experiment Canvas template (work in progress).

When employing AI in your organisation, it’s also good to keep some simple rules of thumb in mind (these are adapted from some helpful guidelines published by LOTI for local authorities and some guidelines shared by DataKind in a recent workshop):

  • Have a clear purpose: What problem are you trying to solve? Focus on user needs.
  • Set clear boundaries: Be clear about the boundaries of your experiments (e.g. we’ll only use our published documents; we won’t use personal data; we’ll only use it internally).
  • Data compliance: Abide by existing data policies. Check the data policies of the tools you’re using.
  • Confidentiality: Never share sensitive information.
  • Standardisation: Start with tools that are already adopted by your organisation to safeguard data and privacy.
  • Fact-checking: Check the content is factually correct.

A combination of these approaches and principles is giving us some framing and boundaries to experiment responsibly with AI. As is the case with most things AI-related, this is very much a work in progress and will evolve quickly.

Our journey into AI is very much a shared learning experience. The more we all share our experiments and learnings across civil society, the better equipped we’ll all become to navigate the opportunities and challenges that AI presents.

We’d love to hear how you’re getting on with AI applications in your organisation. If you’re interested in sharing and learning together to expand our collective understanding, why not get involved in one of our AI peer groups?

And last but not least, please remember to send us your AI-generated dance hits (other genres very welcome too) :)
