CAST’s AI survey: what we’ve learned, what we’re doing — and how you can get involved

CAST
Published in CAST Writers
13 min read · Mar 12, 2024

We’ve spent lots of time recently at CAST experimenting with AI, and considering how, in such complex and fast-changing conditions, we can best design support aligned to sector needs. We’ve gleaned valuable information and inspiration from the work of Zoe Amar, Torchbox, The Civic AI Observatory, Charity Excellence, Promising Trouble and SCVO, amongst others.

To build on these insights, we developed a survey designed to answer some of the specific questions we were asking about how we at CAST could best respond to current needs — both directly and in terms of helping to influence response and action across the sector.

In this blog, we’re going to delve into the results of the survey, looking not just at the needs and behaviours revealed by the data, but also at the fascinating themes that emerged from individuals’ responses to open questions. We’ll also lay out our plans for support interventions, with calls out to charities and other social impact organisations, as well as funders, partners and the wider community for input and collaboration as we move forward.

The survey: who responded?

Between 13th December 2023 and 7th February 2024, the survey received 164 unique responses. The majority of respondents (59%) worked for large or medium charities, and most (75%) worked for organisations with 1–100 employees. The most commonly cited job functions were (in no particular order): CEO; Director; Comms; Digital Lead / Manager; Fundraiser; Head of Digital; and Product, Project or Programme Manager. 79% of respondents stated that their organisation supported at least one marginalised group. The majority of respondents reported that their organisation had at least some foundational digital knowledge, with only 2% stating that they were struggling to use the basics.

What did we learn?

If we had to summarise the results of the survey in one sentence, it would be: There’s a healthy appetite for — and attitude towards — AI from an individual perspective, but this is set against a pronounced lack of support on an organisational and wider sector level.

Let’s dive into the detail, starting with the quantitative story, as told by the stats:

What did the stats tell us?

The highs…

A simple infographic summarising the statistics given in the main text, with four sections: A positive mindset; Organisational benefits; High levels of AI adoption; Common usages

In terms of attitudes, perceptions and engagement with AI, the picture was remarkably positive: when asked to rate their feelings about the increased use of AI in their professional life (on a scale where 1 signified ‘very concerned’, and 10 ‘very excited’), respondents averaged 6.4. Similarly, 64% felt that AI would be beneficial for their organisation, with just 2% perceiving it as ‘highly challenging’.

Perhaps unsurprisingly, given these perceptions and attitudes, the survey revealed high levels of AI adoption and a keen appetite for experimentation. 53% of respondents had already used AI in their organisation’s operations or projects, and a further 20% were planning to do so. The most common usages were content- and copy-based: creating or editing content, transcribing or summarising copy, and checking spelling and/or grammar.

The lows…

A simple infographic summarising the statistics given in the main text, with four sections: Lack of policies; Lack of research; Lack of training; Lack of support

However, it was clear that individuals’ personal appetite for AI experimentation was not matched by support on a wider organisational level. Just 6% of respondents stated that their organisation already had an AI policy in place (although 25% were in the process of formulating one). And only 20% stated that their organisation was actively involved in conversations and / or policy-making regarding ethics in AI. It was also interesting to note that, whilst the majority of respondents had already used AI within their organisation, only 5.5% stated that they were collecting data on the impact of AI on their beneficiaries.

This lack of wider organisational conversations and actions around AI seemed to be reflected within the approach to staff development: 51% of respondents stated that their organisation was providing no AI training and support whatsoever. And when asked about support available to the sector, just 6% of respondents felt there was sufficient support available for upskilling on AI-related developments and tools.

The needs…

Given the apparent lack of support structures, it’s unsurprising that only 1% of respondents stated that their organisation faced no challenges in adopting AI — and when asked to think about support needed to use AI within a work context, just 2% stated that they needed no help at all. Indeed, each of the eight skills-based support options within the survey was selected by more than half of respondents, with the top two being: Identifying how AI can help create efficiencies in our day-to-day operations and Developing and implementing the right policies and guidelines for AI use. Take a look at the full breakdown below:

A bar graph showing responses to the question: Thinking about the use of AI in your work, which of these do you feel you need support with?

As well as the ‘what’ in terms of skills and knowledge, our survey also investigated the ‘how’ — i.e. which types of support would be most useful. Again, there was a significant appetite expressed for all strands, with each of the six options selected by more than 60% of respondents. The two most popular support types were: Taking part in AI training workshops and Access to peers within the sector to discuss AI. Take a look at the full breakdown below:

Bar graph showing the percentage of people who answered ‘important’ or ‘very important’ to the question: How important is it for you to receive the following type of support to explore and implement AI effectively?

What did the stories tell us?

The statistics paint a fairly clear picture, but we also asked three open questions, all of which were optional — and the levels of response were interesting to note. Of 164 respondents, 57 answered the ‘other comments’ question, 79 the ‘funder support’ question — and a remarkable 124 the ‘magic wand’ question. This in itself seems to suggest that people are very keen to share their opinions, needs and concerns — and to reach out for support.

In something of a meta moment, we actually experimented with using AI (specifically: Notion AI) to synthesise the themes from these AI survey questions! We did then go through and edit, but Notion AI was probably 80% correct, and saved us a fair bit of time.

We’ve pulled out some key themes and quotes for each of the questions as below; take a look at our website for more detailed themes.

Key themes that emerged from responses to the question ‘Can you think of any ways that you’d like funders to support you to use AI in your organisation?’ included:

  • Funding support for AI experimentation, skill development, and pilot projects
  • Training and workshops to help organisations understand and implement AI
  • Guidance and resources on ethical considerations and mitigating bias in AI
  • Grants and funding for staff capacity and resource development
  • Collaboration and sharing of expertise within the sector
  • Redesigning funding applications to accommodate AI projects

Individual responses to this question included:

“Be clear about how funders use AI when assessing applications and whether they accept grant applications where AI has been used or not.”

“Provide funding for infrastructure bodies to develop free resources to enable charities to understand which tools are helpful/ unhelpful and the ethical and legal considerations.”

“Pay for time spent researching, especially user research and considering its place within our strategy. Pay for headspace time!”

Key themes that emerged from responses to the question ‘Imagine you had a magic wand. You can determine what happens with AI in the next 12 months. What do you wish for?’ included:

  • Availability of — and accessibility to — AI tools and solutions for all
  • Clearer policies, standards, and guidelines for AI use
  • Ethical guidelines and responsible use of AI in the charity sector
  • Reduction of bias and inclusion of diverse voices
  • Ensuring small charities have access to AI solutions
  • Unified sector collaboration, shared experiments, and funding for ethical AI use

Individual responses to this question included:

“That small charities aren’t left behind. More than most, we need AI solutions that automate tasks and free up resource so we can dedicate our limited resources on frontline work.”

“Humanity and kindness is programmed in to all tools, helping to remove propaganda and fake news across social media platform, and identifying and removing harmful content.”

“Decentralised AI tech and models, not just owned by a handful of large, powerful private companies. Models and datasets include multiple voices, viewpoints and languages to reduce bias and reinforcement of existing power structures. AI for all.”

And lastly, key themes that emerged from responses to the open call for any additional opinions or needs with regards to AI included:

  • Concerns about the digital divide and exclusion of those without tech skills / access
  • Concerns about ethical implications, biases, and exploitation risks
  • Challenges in understanding and navigating the possibilities and concerns surrounding AI, particularly for small charities with limited resources
  • Calls for collaboration, shared learning, and resources within the charity sector
  • The importance of regulation, privacy, and establishing ethical standards for the use of AI in the sector
  • Challenges in funding applications and distinguishing truth when AI is used

Individual responses to this question included:

“We need faster policy decisions, and a list of providers to be shared so you know where to go…to help weed through the layers of ‘experts’.”

“I worry about the ethics and that it will be used to exploit vulnerable people. It is unregulated and there is no need to say if you used AI or not.”

“We need to consider the sector as a whole including funders’ use of AI and how it will impact their grants processes, in order that applicants can better respond.”

Taken as a whole, the quantitative and qualitative data reveal clear needs, challenges and concerns: the need for practical training and support; concerns about ethics and bias; calls for funding support and funder clarity over AI use; fears that smaller charities will be left behind; the need for clear policies and guidelines; and concerns about misuse and misinformation. However, these sit alongside a clear appetite for adopting AI, and indeed calls for collaboration and shared experimentation amongst sector organisations. So the question is: how can we best support this?

What are we doing?

We plan to use the survey insights to formulate some practical support activities — as well as to help influence the response across the wider sector. As ever, we’ll be working in the open as our plans take shape — but at this early stage, we’re laying out our intentions as below.

As part of this, we’re putting out some calls for information to help us put together as robust and representative a support offer as possible, whilst minimising duplication and prioritising collaboration and reuse. If you can help with any of the requests for information as below (in bold), please email hello@wearecast.org.uk — thank you!

AI support resources

  • In collaboration with our steering group, we’ll be adding an AI section into the Digital Skills Framework. We will also look to add an AI section into the CAST Digital Toolkit.
  • We’re keen to connect with other organisations that are providing free AI support and resources for charities and other social impact organisations. Ideally, we’d love to create a network where organisations signpost to each other’s resources, so that wherever someone ‘lands’ when looking for help, they have centralised access to all other available resources and directories. We’re already aware of the excellent resources being provided by The Civic AI Observatory, Torchbox, Charity Excellence, CharityComms, Promising Trouble, Zoe Amar and SCVO, and we will signpost to these via our platforms and toolkits. Get involved: if you are aware of any useful AI resources that are available free of charge for the charity sector, please do send us links and we will look to signpost to them — thank you!
  • We’re also considering building out some AI experiment principles (along the lines of the Better Digital Services design principles). Get involved: do you know of anyone already working on something similar, with whom we could potentially collaborate?
  • Although the survey was anonymous, we did offer respondents the option of adding their contact details if they were keen to discuss the topic further. Over the next few weeks, we’ll be reaching out to those individuals that added their details — and we’re particularly keen to speak to anyone that can help ‘fill in the gaps’ that have emerged from the survey — e.g. anyone that does have an organisational AI policy they are happy to share, or anyone whose organisation is conducting research into how AI is affecting beneficiaries. Naturally, if we receive permission to make any information, materials or resources open, we will share across our channels. Get involved: if, when you read through the above results, you felt that you (or your organisation) had resources, materials or other information that could be used to help meet the needs or mitigate the challenges expressed, we’d love to hear from you!

Events / workshops / peer group meetings

  • Next week (21st March), we’ll be presenting a workshop entitled ‘AI in action — Making it work for you’ at Charity Digital’s ‘Transform and Thrive in 2024’ event. Drawing on results from the survey, the workshop is designed to provide senior management from medium and large charities with the practical tools, inspiration, and confidence to experiment ethically with AI within their own organisations — and to encourage peer-to-peer sharing and learning. We will of course share any key outputs and resources openly, on Medium and our social channels.
  • We recently started convening an AI in Grantmaking peer group, which brings together funders to discuss the opportunities and challenges presented by AI — not just within Trusts and Foundations’ own practices, but within civil society organisations too. The group — which currently comprises 50 Trusts and Foundations — is open to any UK funder of any size; to find out more and sign up to be part of the April meeting, take a look at the AI in Grantmaking section of the CAST website.
  • We’re currently working with Deloitte’s AI Institute on an AI mini-series for 5MillionFuture charity partners, the current Deloitte Digital Connect cohort and members of CAST’s Digital Leads Network. The series will cover many of the topics that have been raised as priorities in this survey, namely ethical, bias and privacy concerns; strategies for developing effective AI policies and procedures, and best practices for responsible AI use — as well as sharing stories from charities that have successfully incorporated AI into their operations. We will aim to make as much of this information as possible available in the open.
  • We have an in-person gathering of the Digital Leads Network (DLN) coming up later this month, and AI is high on the agenda, particularly in relation to peer group connections and networking. Once we have agreed what shape this might take for the DLN, we will look to either open this out or replicate the model for the sector. Get involved: If you’re a charity digital lead interested in meeting peers to network, share best practice and learn from each other, please join us at the DLN in-person meetup on 28th March.
  • We’re looking into the possibility of convening an online ‘Festival of Learning’, whereby charities share the practical ways in which they’ve used AI, which could be feasibly replicated by other organisations — and share useful resources such as policy templates. Get involved: do you know of anyone already working on something similar, with whom we could potentially collaborate?

Ongoing research and support

  • We’re already in conversation with a number of organisations about how to share research results and collaborate on support interventions — and we’d love to connect with any others! We’re keen to understand where survey / research results might be aligning (or diverging) — and what that means in terms of any patterns developing or needs evolving. And naturally we’re keen to collaborate on support interventions where possible, in order to reinforce quality and minimise duplication. Get involved: if you know of or have been involved in any other such surveys or support interventions, please let us know. Or, if you’re looking to run a survey and would like to use some or all of the questions we asked, we’ll happily share them with you: please just get in touch! And if you’re developing any support interventions that you feel could benefit from our input, please do get in touch; we’d love to chat about any collaboration opportunities.

Ongoing experimentation

  • We’re currently running a number of AI experiments within the CAST team — both across specific projects (e.g. using AI to conduct user research synthesis for Design Hops) and also more widely (e.g. exploring the potential of ChatGPT to act as an advisor on digital best practice for charities). We’re also in the process of designing an AI experimentation canvas, which we will share as an open access resource for the sector when complete. We’re gathering data across each of our experiments, and will be making any useful results, recommendations and resources available to the sector in due course.

Please do let us know which elements of the above feel most interesting / useful to you — and what, if anything, you feel is missing from these initial support plans.

Closing remarks

We hope this has been a useful round-up of our survey findings and planned next steps. Please do feel free to quote our survey in any work you might be doing around AI; if you could attribute it to CAST and ideally include a link to this blog piece for full clarity, that would be much appreciated!

Look out for updates over the coming months as we work through our initial plans, and seek to provide support both via our own efforts and by collaborating with others. And as ever, please get in touch via hello@wearecast.org.uk with any questions or suggestions — thank you!

An observation: Respondents are likely to have seen the survey on a platform that has some focus on digital, whether that’s CAST or Catalyst’s channels, or those of a funder or support organisation. This may mean that they skew towards the more digitally keen and confident. Indeed, only 3% of respondents stated that they had no prior knowledge of AI. This may be worth keeping in mind; however, it’s also worth noting that even this potentially more ‘digitally able’ group expressed significant concerns and support needs.

Article written by Sonya Hayden, Head of Communications for CAST


The Centre for Acceleration of Social Technology — upskilling and upscaling social sector organisations to use technology for accelerated social change.