A digital power glossary

Rachel Coldicutt
Careful Industries
Jun 1, 2022

What do the words we often use to talk about digital power really mean?

Four boxes: Box 1: Exclusion in an exclusionary system — Data injustice, Data poverty, Digital exclusion, Algorithmic bias, Digital inequality; Box 2: Inclusion in an exclusionary system — Digital access, Digital inclusion, Digital ethics, Debiasing data, Algorithmic fairness; Box 3: Recognising power imbalances in an exclusionary system — Digital equality, Digital equity; Box 4: Redistributive and liberatory systems — Digital justice, Digital decolonialism, Indigenous data governance
Terms used to describe digital power, arranged in 4 groups

This glossary defines and groups some of the terms commonly used to describe the intersections of power and digital technologies. In creating it, we found that some terms are more likely to be used in conjunction with projects that maintain the status quo, while others are more commonly used to inspire and imagine the creation of new futures.

There’s some preamble here — explaining why we made it, and how we grouped it — but if you want to get straight to the glossary, scroll down the page.

If you want to know more about the context, then read on. Some influences and hopes for the future follow at the end.

About the glossary

Some of these terms might seem interchangeable — and perhaps they have become interchangeable in your own practice. After all, it’s difficult to pick a term and stick to it, particularly when the orthodoxy of “good style” holds that words shouldn’t be repeated again and again. However, the more we explored how language relates to digital power, the clearer it became that the choice of words and their meanings matters a great deal.

All the definitions given below were drawn from recent studies and scholarship, and then grouped according to where they sit in relation to power and commitment to change.

While this is by no means a full bingo card of the most common terms, it demonstrates that phrases that might feel as if they belong to the same family actually mean different things and convey different intentions. For instance, when something is open it is more accessible than something that is closed — but simply being open is not the same as being accessible or redistributive.

Why did we make it?

This glossary was originally created as part of a client project to show the importance of matching terminology with purpose and intention. We’ve been adding to it since last year, and are sharing it because it feels like something that might be useful (or, at least, thought-provoking) for others doing similar work.

Much of our work across both Promising Trouble and Careful Industries is about turning theory into plans and practice — for instance, working out how to roll out free Internet access for people who live in urban social housing, or helping a health-research organisation develop an actionable approach to data ethics — so accurate, accessible language is extremely useful and important to us.

When we partner with an organisation or take a commission, we want to understand quickly what they want to achieve, and how our values align. We’re not here to validate the status quo, but to create change: our social enterprise arm is called Promising Trouble because, as Donna Haraway puts it, it’s important to “stay with the trouble” — in our case this means it’s important to encourage technologies to be used and created for liberatory and redistributive purposes.

About the grouping

The terms we selected are collected into four groups:

  1. Terms that represent how people are excluded in an exclusionary system
  2. Terms that represent how people are included in an exclusionary system — for instance, programmes that provide short-term relief within current norms but do not create a new paradigm
  3. Terms that recognise power imbalances within an exclusionary system — programmes that seek to make adjustments to the current paradigm
  4. Terms that indicate progress towards redistributive and liberatory systems — work committed to creating new norms

As every technology ethicist knows, all taxonomic structures are the expression of an ideology, and this structure supposes that redistributing power is a good thing for everyone. It comes from a place of technology pragmatism and an understanding that people, not technologies, are responsible for social change. Some of the influences on this thinking are shared at the end of the post.

The glossary

All sources are given at the end and linked from each definition.

Box 1: Exclusion in an exclusionary system — Data injustice, Data poverty, Digital exclusion, Algorithmic bias, Digital inequality
Terms that represent how people are excluded in an exclusionary system

1: Terms that represent how people are excluded in an exclusionary system

Data injustice: the practices through which an individual or community experiences invisibility, hypervisibility, a lack of agency to determine their engagement, and/or discrimination when accessing or using tools and technologies (Linnet Taylor, see note 1)

Data poverty: “By data poverty we mean, those individuals, households or communities who cannot afford sufficient, private and secure mobile or broadband data to meet their essential needs.” (Adam Lang et al., see note 2)

Algorithmic bias: the encoding of new and existing unfair inclinations based on characteristics, arbitrary factors, or an otherwise inappropriate basis (Centre for Data Ethics and Innovation, see note 3)

Digital exclusion: “the unequal access and capacity to use information and communication technologies (ICTs) that are seen as essential to fully participate in society” (Schejter et al, see note 4)

Digital inequality: “Digital inequality refers to differences in the material, cultural and cognitive resources required to make good use of information and communication technology (ICT)” (OECD, see note 5)

2: Terms that represent how people are included in an exclusionary system

Box 2: Inclusion in an exclusionary system — Digital access, Digital inclusion, Digital ethics, Debiasing data, Algorithmic fairness
Terms that show inclusion in an exclusionary system

Digital access: “The access doctrine decrees that the problem of poverty can be solved through the provision of new technologies and technical skills, giving those left out of the information economy the chance to compete.” (Daniel Greene, see note 6)

Digital inclusion: “For the UK to be a world-leading digital economy that works for everyone, it is crucial that everyone has the digital skills they need to fully participate in society” (Department for Digital, Culture, Media and Sport, see note 7)

Algorithmic fairness: “ensur[ing] that algorithmic decisions do not create discriminatory or unjust impacts when comparing across different demographics (e.g. race, sex, etc)” (FAT ML, see note 8)

Debiasing data: “the practice of attempting to remove new and existing unfair skews in facts, details, statistics, or any information collected together for reference or analysis” (Our Data Bodies, see note 9)

Digital ethics: a practice, rather than a set of values, that determines the social, economic, political, and environmental impact of technologies

3: Terms that recognise power imbalances within an exclusionary system

Box 3: Recognising power imbalances in an exclusionary system — Digital equality; Digital equity
Terms that recognise power imbalances in an exclusionary system

Digital equality: the condition in which all members of a community have the same access to tools and technology, and the same knowledge to use them

Digital equity: the condition in which all individuals and communities have the information technology capacity needed for full participation in our society, democracy and economy (National Digital Inclusion Alliance, see note 10)

4: Terms that indicate redistributive and liberatory systems

Box 4: Redistributive and liberatory systems — Digital justice, Digital decolonialism, Indigenous data governance
Terms that indicate redistributive and liberatory systems

Digital justice: systemic practices that ensure all members of a community have equal access to media and technology, the skills and knowledge to create their own technology and engage in decision-making processes regarding the technology they use, hold common ownership over the tools and technologies they use, and opportunities to generate community-based solutions to problems (based on Detroit Digital Justice Coalition Principles, see note 11)

Digital decolonialism: “Decoloniality is not merely diversity and inclusion; removing the echoes of coloniality in AI will require reparations for present and past material and epistemic injustice and dispossession… our aim is to create and hold a resonant forum for learning and exchange from and between voices silenced by colonialist structures and the coloniality in force through socio-technical systems” (The AI Decolonial Manyfesto, see note 12)

Indigenous data governance: “the right to create value from Indigenous data in ways that are grounded in Indigenous worldviews and realise opportunities within the knowledge economy. The CARE Principles for Indigenous Data Governance are people and purpose-oriented, reflecting the crucial role of data in advancing Indigenous innovation and self-determination. These principles complement the existing FAIR principles encouraging open and other data movements to consider both people and purpose in their advocacy and pursuits.” (“The CARE Principles for Indigenous Data Governance”, see note 13.)

Some influences

Recent re-framings of the terms “access” and “inclusion” by US academics Daniel Greene and Anna Lauren Hoffmann have helped to pinpoint what can feel uncomfortable about the impact of some digital-inclusion initiatives. I write this as someone who was instrumental in closing a digital-inclusion programme in 2016; at that moment it felt possible to create genuinely systemic change by advocating for better rights and legal protections. Of course, improving people’s skills is important, but others were already doing that, and our aim was to try to create better systems for more people, rather than giving more people better skills to engage with broken systems.

Hoffmann’s 2021 essay “Terms of Inclusion: Data, Discourse and Violence” eloquently makes the point that inclusion programmes of all kinds often uphold the status quo by inviting alienated peoples to take part in the structures that have oppressed them. Hoffmann quotes Ruha Benjamin, who writes in Race After Technology that inclusion is often:

part of a larger repertoire of ‘happy talk,’ which involves a willingness to acknowledge and even revel in cultural difference without seriously challenging ongoing structural inequality.

This is particularly the case with data and digital technologies, which might then be deployed as surveillance and monitoring tools, or as a means of capturing data to power potentially discriminatory decisions.

Daniel Greene’s book The Promise of Access is a study of digital access programmes in the US, but many of the findings are applicable in the UK too: Greene unpacks the ways that “digital access” has often been identified as a hopeful cure-all for social and economic mobility. He calls this “the access doctrine” and identifies numerous ways in which “the problem of poverty [has been made] a problem of technology” — even while those same technologies have played a role in entrenching traditional power and social stratification.

These tensions have been particularly apparent during the pandemic, when digital-inclusion programmes have provided essential lifelines and resources for many — but the time and space has rarely (if ever) been created for those same programmes to identify and deliver the changes that would make their missions obsolete.

Towards plural and regenerative futures

Some of the many scholars and practitioners who have influenced this work are linked in the notes below; the thinking here has been developed in dialogue with their work, and they have made it more possible and urgent to challenge the idea that “inclusion” is enough.

Much of our work at both Careful Industries and Promising Trouble explores how technologies can enable plurality: rather than removing diversity or showing a single truth, technologies and data standards can be created in ways that reflect and enable multiplicity. The early Web concept of “small pieces loosely joined” can be reimagined as a method of celebrating and reflecting the multiplicity of humanity; of connecting and amplifying, rather than colonising and standardising. These are concepts we are exploring in fields as different as developing a new method of relational foresight and, with Power to Change, supporting the development of place-based technologies for communities.

The CARE Principles for Indigenous Data Governance (cited above) are a particular source of hope and inspiration. They offer practical steps for using data in ways that deliver collective benefit — prioritising governance, self-determination and reciprocal responsibility. Rather than setting out steps for including Indigenous Peoples in the digital status quo, the principles set an expectation for building a status quo based on Indigenous Peoples’ needs — for sharing data as a conduit to sharing power and decision-making, and to maintaining plurality. Some inclusion programmes inadvertently maintain a model of the past; this seems a truly inclusive and innovative way to shape the present and the future.

Our hope in sharing this glossary is to encourage more inclusion-focussed programmes to look beyond the present: to engage in shifting entrenched power, designing (at the least) equitable systems, and pushing beyond that for just and redistributive outcomes.

The present moment is a great inflection point: as many of us angle towards designing for planetary justice, there is an opportunity to step away from trickle-down strategies that entrench the inequalities of the status quo and start exploring regenerative approaches in which data and technologies enable real power-sharing and accountability.

Notes

(1) Linnet Taylor, “What Is Data Justice? The Case for Connecting Digital Rights and Freedoms Globally,” Big Data & Society 4, no. 2 (December 1, 2017): 2053951717736335, https://doi.org/10.1177/2053951717736335

(2) Adam Lang et al., “Defining Data Poverty,” nesta, accessed April 10, 2021, https://www.nesta.org.uk/project-updates/defining-data-poverty/.

(3) Centre for Data Ethics and Innovation, “Review into Bias in Algorithmic Decision-Making,” GOV.UK, November 7, 2020, https://www.gov.uk/government/publications/cdei-publishes-review-into-bias-in-algorithmic-decision-making/main-report-cdei-review-into-bias-in-algorithmic-decision-making.

(4) Amit Schejter, Orit Rivka Ben Harush, and Noam Tirosh, “Re-theorizing the ‘digital divide’: Identifying dimensions of social exclusion in contemporary media technologies,” FACE Conference: European Media Policy 2015: New Contexts, New Approaches, April 9, 2015

(5) OECD (2015), “Inequalities in Digital Proficiency: Bridging the Divide”, in Students, Computers and Learning: Making the Connection, OECD Publishing, Paris. DOI: https://doi.org/10.1787/9789264239555-8-en

(6) Daniel Greene, The Promise of Access: Technology, Inequality and the Political Economy of Hope (MIT, 2021)

(7) “Digital Skills and Inclusion — Giving Everyone Access to the Digital Skills They Need” (DCMS, GOV.UK), accessed October 4, 2021, https://www.gov.uk/government/publications/uk-digital-strategy/2-digital-skills-and-inclusion-giving-everyone-access-to-the-digital-skills-they-need.

(8) “Principles for Accountable Algorithms and a Social Impact Statement for Algorithms :: FAT ML,” accessed December 1, 2021, https://www.fatml.org/resources/principles-for-accountable-algorithms.

(9) Our Data Bodies, “Digital Defense Playbook,” Digital Defense Playbook (blog), accessed December 1, 2021, https://www.odbproject.org/tools/.

(10) “Home — National Digital Inclusion Alliance,” accessed December 1, 2021, https://www.digitalinclusion.org/

(11) Detroit Digital Justice Coalition, “Digital Justice Principles”, accessed December 1, 2021

(12) Aarathi Krishnan et al., The AI Decolonial Manyfesto, accessed December 1, 2021

(13) S. R. Carroll and M. Hudson et al., “The CARE Principles for Indigenous Data Governance”, The Global Indigenous Data Alliance (2018), https://www.gida-global.org/care
