Influencing AI

Dan Sutch
Published in CAST Writers
Nov 3, 2023

I was recently invited to speak on a panel discussing AI and civil society. The particular question we were asked to consider was ‘how could civil society shape the future of tech and AI?’

The question is closely aligned with our work at CAST. Our strategic mission is to accelerate the agency, presence and influence of social impact organisations on the technologies that affect us all. For us, much of that influence comes from having agency and being present in technology: the skills, experience and understanding involved in using and creating digital technologies which, when aligned with the charitable purpose of these organisations, give us a unique view of, and role in, shaping technology.

The session was quite short, so the points below are succinct, but they're the ones that seem most important at the moment in addressing this question.

So, how can civil society influence digital and AI?

Let’s be realistic

Using 360Giving’s GrantNav tool we can explore how much funding has gone to charities’ digital work. The trends over the last few years are stark: there has been a significant drop in funding to support charities in developing digital projects, skills and experience. In the last full year, 2022, £28 million was given to charities for digital work (excluding grants for bio/natural sciences). That sounds like a large sum, but it is only about 0.5% of the funding captured in 360Giving for that year. In the same year, global corporate investment in AI was almost $92 billion (£76 billion). Put another way, UK funding for civil society digital work is not even one-third of one percent of the funding going to develop technology elsewhere. In that context we need to be realistic: we will not compete, or have a loud voice, in the development of AI unless we change our priorities and make better collective use of the funding available for charity digital.

I appreciate, of course, that civil society’s influence doesn’t come only from direct funding; civil society has influence in a number of different ways. But these figures begin to set the context for civil society organisations’ opportunities to experiment, learn and become more influential in this space.

Screenshot from 360Giving’s GrantNav website showing the number and amount of grants going to digital projects for charities. The trend shows a decrease in the amount of funding going to digital projects for charities.

For that reason I’ll highlight two priorities (given limited resources), and two ways that we can make better collective use of funding for charity digital (to maximise the resources that are available).

Priority one: shorten the timeframe for influence

Many of the concerns about AI focus on existential threat and huge long-term challenges. It’s important that we support vital organisations like the Ada Lovelace Institute, Open Data Institute, Connected by Data and Careful Industries to focus on the governance of AI over the long term, driven by their social values and deep expertise in these areas. However, for almost every other civil society organisation, we need to focus on the opportunities, harms and negative consequences of technology that are happening now and in the short term. We need to prioritise where best to act, and it’s critically important that we champion individuals and communities who are currently being negatively impacted by digital and AI.

Priority two: demand explainability and train those who need to understand how AI and computer-aided decisions are made

Some of the most significant challenges to the sector will be the loss of agency and power in championing those who are negatively affected by the use of technology and advocating on their behalf. Where decisions are automated and opaque, advocates cannot do their vital work. We need to support, test and champion tools and approaches that show how algorithms are being used to make decisions, so that those decisions are as open and accessible as possible, and so that unintended consequences and inappropriate decisions can be challenged. We also need to support the development of skills and capability among those who have this role, particularly those in advocacy-based roles.

One significant change is in how unintended consequences or bias within algorithms are identified: they tend to be spotted over the long term by data scientists, once there is enough data to reveal patterns and trends. This shifts the role of advocating for change: from charities to data scientists; from providing direct support to individuals who are harmed, to supporting cohorts once they are large enough; from immediate support from advocates to individuals, to longer-term change once cohort analysis is complete. Whilst the work of investigative journalists and data scientists is critical, it shouldn’t replace the direct and personal work of advocates. (Which I misspelt as ‘advocats’, and I think that’s a much better term for the cool people doing this role.)

Collective alignment one: reuse — maximises resources and sector influence

To make the best use of limited resources and to increase our chance of influence, we must align and work together. One element of this is better sharing of learning, progress and outputs, so that we can make collective progress. Open IP within grant-making provides greater opportunities for this sharing; focusing on, and encouraging, reuse of sector- and community-owned tech aligns our work in a way that supports both individual development and shared collective progress.

Collective alignment two: civil society as the most incredible untapped sensor network

Civil society organisations are our greatest untapped sensor network. Each node hears stories and examples of the challenges people face. They’re very often the first place to hear these stories: the first to sense new challenges and, in this instance, the negative consequences of technology.

Sensor networks quickly aggregate lots of small data points to identify trends, themes or weak signals. If we had a way to aggregate these individual points, we could more quickly find supportive responses for individuals (sharing best practices) but also more quickly evidence and address the systemic impact of technology.

Funders already have this portfolio view, though not the specific role of identifying these trends. Second-tier organisations (like NCVO and SafeLives) have this portfolio view, but not always the specialism to connect or focus on digital. Exploring mechanisms to capture and share these observations would let us test how we can move from individual charities working 1:1 to a sensor network that means we act collectively.

So what next? In this emergent space it’s important to keep sensing the impact of emerging technologies, positive and negative, so we can generate insight to inform opportunities to act and influence. We’ll share what we’re noticing and learning about the opportunities and challenges of AI. Please share any examples you notice through your work too; perhaps we can begin to model the open, aligned activity of our influential civil society sensor network.
