Interview with Dan Sutch

Published in EAAMO · May 18, 2024

Dan Sutch is the co-founder and director of CAST, the Centre for the Acceleration of Social Technology, where he works with charities and third-sector organisations to help them use and adapt to digital technologies to address major social challenges. Our EAAMO-Bridges working group, Conversations with Practitioners, had the pleasure of interviewing Dan in January 2024. In this blog post, we share insights and lessons we learned from the interview for our audience of researchers.

Dan Sutch — Director of the Centre for the Acceleration of Social Technology (CAST). Photo taken from: https://about.me/dansutch

Sakina Hansen and the Conversations with Practitioners Working Group.

Implementing Digital Design with the Most Social Impact

“We figure out how we can use digital design and data in the most positive social way, with the most positive social impact.”

Dan works with a wide range of organisations on a broad set of programmes. Sometimes this involves working with trusts and foundations to help them think about their role in supporting charities with their data needs, whether by supplying the charities with extra support or by using digital design in their own processes. For example, CAST runs a network of 35 trusts and foundations to help them understand the role of generative AI in their work. CAST wants to help funders fund projects and charities better.

Another type of work CAST does is to help specific charities incubate new digital tools in their products and services, or in their research and discovery processes, to identify where technology can be useful. For example, in a current project CAST is working with a network of 10 organisations in Manchester that provide services for asylum seekers and refugees, helping them improve signposting and referral tools so they can deliver quicker and better services. CAST also helps charities deal with external uses of technology, particularly when new technologies make their work more difficult.

Skills Gap in the Charity Sector

“Changes in technology is making the gap between demand and support even bigger”

Dan spoke to us about how charities are constrained by resources and time. This stems from how charities are funded: they rely on donations, so they run on goodwill and ambition, which are difficult to depend on financially. Money that comes in understandably often goes straight to the front lines to deliver core processes; as a result, little is invested in skills or innovation.

CAST also works with digital agencies and teams who want to use their skills to support charities. Dan explains that “engineering often doesn’t work well with how charities work”, so CAST tries to bring them together. For example, Catalyst provides technical resources to charities.

In the UK, we have seen a huge increase in the need for charity work recently, from food banks to international crises. Because of this lack of investment in skills, changes in technology make the gap between support and demand even larger. With almost all emerging tech, the charity sector responds by partnering to gain support; it does not think about internal skills until the technology becomes very stable and widely adopted. Charities are therefore reliant on organisations and people outside the sector, such as consultants, external resources, and direct support for technical needs. When considering upskilling their own processes, their priorities are, in order: improving communication and fundraising, organisational efficiency, and technology in service delivery.

Harmful Use of Tech and Algorithms

“Charities often pick up a negative consequence of technology, whether it be mobile phones, whether it be algorithmic decision-making. They are often the first to spot those negative implications, but they’re often not the best place to do anything about it.”

Charities see a lot of harmful use of technologies. For example, the charity SafeLives helps victims of domestic violence. Five or six years ago, it recognised that tracking features like Find My iPhone had become new tools for stalking, yet the charity had no real way of engaging with technology providers or companies to manage the issue. Algorithmic decision-making is also a source of problems for charities. For example, a charity providing support for people applying for disability allowance could support recipients and advocate for them in the traditional way. Previously, they overturned essentially 100% of negative decisions because they knew the rule book inside and out. However, as allowance decisions become more algorithmic, this expertise no longer applies, and their overturn rate is dropping from nearly 100% to almost 0%. The increase in algorithmic decision-making makes it even harder for charities to respond to new harms because they are currently unable to engage with it.

Charities usually spot the negative consequences of technology very quickly but are often not the best placed to mitigate them. For example, when the Home Office deployed a harmful algorithm for visa approvals, getting the tool removed was largely down to investigative journalists and researchers rather than the charities who worked with applicants themselves. This illustrates a shift: instead of individuals finding immediate support from charities to challenge decisions, data scientists and researchers address cohorts of people once they have enough data, offering not immediate support but remedies for broad issues affecting groups. These challenges can now only be addressed by experts, not charities. This raises a number of open questions for researchers and scientists: Can we provide tools to charities so they can continue working by providing individual support? Or do we need more partnerships where data scientists and researchers work directly on the issues to try to reach individuals and provide services directly to them?

More Collaboration

The charity and third sector and the research community do not work together a great deal, so it is difficult to gauge whether academics' and researchers' best intentions are correct and helpful in practice. Researchers do not want their work to simply look good on paper; we need more partnerships. What kind of partnerships would be useful to charities on a day-to-day basis?

Recently, CAST conducted a survey of charities' experiences with AI. An early finding is that charities misunderstand AI and generative AI, and so cannot adequately assess how these technologies would affect their communities. The concerns that shape the impact on communities fall broadly into three categories: who owns AI, how AI is governed, and the economic models that underpin it. On the other hand, there is also optimistic messaging about effective and creative work, identifying priorities, and better fundraising. Additionally, Deloitte Digital Connect releases a regular foresight paper; when this was shared with charities, they said it does not represent the trends in their small communities, since charities have a specific focus whereas many discussions of AI trends are about general trends.

Specific research into historically marginalized communities could be helpful, as could knowledge shares, particularly for charity boards. Most charities are governed or controlled by a board made up entirely of volunteers; these boards make strategic decisions but often lack digital expertise, so answering their questions could be valuable. Other ways Dan suggested researchers can reach out to charities are through platforms such as Digital Candle, where volunteers respond to people's tech issues; Reach Volunteering, where you can volunteer digital skills to help a specific charity; or the Catalyst network, which organises technical support for charities.

Within the charity sector, most challenges are described as wicked problems: problems for which you cannot easily construct a linear argument for how to solve them because so many different factors are involved. Sometimes it is not possible to solve the problem perfectly, but by using digital expertise and research we can get better at addressing it. The question is always: what can we do next to improve our confidence in our social impact?

We would again like to thank Dan for sharing insights, challenges, and obstacles observed in CAST, as well as Sakina Hansen and the Conversations with Practitioners members for their engagement and thoughtful questions.


EAAMO is a multi-institutional, interdisciplinary initiative working to improve global access to opportunity. Learn more at eaamo.org