Telescope on technology

Transforming tech design and policy to tackle TGBV

Naomi Alexander Naidoo · Published in Orbiting · 6 min read · Jul 13, 2022

Orbits is a field guide on how to tackle tech abuse through interventions that are intersectional, trauma-informed and survivor-centred. It focuses on three areas vital to effectively addressing tech abuse: technology, research, and policy. It also proposes a set of design principles for building more impactful interventions.

In this blog, we’ll focus on technology and summarise why it’s an important part of the solution, what’s wrong with technology design and governance as they currently stand, and how technology can become a tool for healing rather than harm.

Why focus on technology?

To address technology-facilitated gender-based violence (TGBV), first and foremost we must look at technology itself. Tech products and services are the tools through which abuse is carried out, so their design and governance are instrumental to how abuse is perpetrated and how it can be stopped. In Orbits, we look at both technology design and ‘little p’ policies: the internal policies, practices, and guidelines of tech companies that regulate their communities.

What is technology getting wrong?

Technology is often designed without considering how it may be used to cause harm and, as a result, has inadequate or non-existent safeguards and support mechanisms. Technology design often replicates the systems of oppression of wider society and amplifies existing inequalities, and reporting and remedial processes are often inadequate, inaccessible, and retraumatising.

There are several features of tech platforms that enable or facilitate tech abuse. These features are not designed for abusers; they usually exist for valid reasons such as user experience or efficiency, but they can be easily exploited to cause harm. Design features common to many platforms that can be used to abuse include:

  • Limited user choice about what information is made public. Most social media platforms make some personal information publicly available, which perpetrators can use to identify, harass, and stalk survivors.
  • Rigid and hard-to-find privacy settings. While most platforms offer a variety of privacy options, these are often inflexible and do not allow people to personalise their privacy preferences, so survivors are torn between risking their safety and completely privatising their account, which can have other professional or social consequences.

Across many platforms, the tools and processes to report abuse are not easy to find or use, are often slow, and may not be available in some languages at all. Furthermore, algorithms frequently fail to flag abuse, even when it’s reported, and human teams working on abuse reports can fail to recognise and appropriately deal with abuse due to a lack of training and context-specific knowledge. This issue is particularly pertinent in the Global South: without sufficient cultural knowledge and training, moderators often do not recognise abusive content as abuse.

Many tech platforms also have vulnerabilities specific to their product. For example, iCloud makes it easy for perpetrators to take over multiple devices and access content, contacts, and more. Snapchat’s Snap Map enables and encourages the sharing of location data. Facebook groups are used extensively to coordinate abuse. YouTube hosts channels for perpetrators seeking advice, guidance, and techniques to help them abuse. Features such as ‘story views’ on Instagram and ‘viewed your profile’ on LinkedIn can be used by stalkers to signal that they are watching.

Underpinning these design and policy inadequacies are systemic causes and structures that create favourable conditions for abuse to flourish and lead to inaction from tech companies. While these systemic issues are not the focus of Orbits, they must be acknowledged to fully understand the lack of progress from tech companies on TGBV. They include: prioritisation of issues and regions (tech abuse is deprioritised, especially in non-priority markets); tech business models (built on engagement, whether that engagement comes from positive communities or hate speech); power asymmetries (the big platforms have grown ‘too big to fail’ and are unaccountable to governments, civil society, and the general public); and a lack of diverse teams and leadership in technology companies (which makes the concerns of marginalised groups easy to ignore).

How can we transform technology?

While the foundational issues that enable tech abuse to flourish require long-term, systemic change (in Orbits we shout out many of the amazing initiatives leading this change, such as the Mozilla Foundation and the open source movement), there is also a lot that tech companies can do now to combat abuse through intersectional, survivor-centred and trauma-informed product and policy design. When designing online tools, we should approach them as though we were designing a physical space, say, a cafe. What do we want people to think when they stand on the street, looking at our cafe window? What would it feel like to step inside? Would they want to take a seat and linger, or quickly grab something they need and leave? Do they feel they can do both, depending on their mood and routine? Applying an intersectional, trauma-informed, and survivor-centred lens presents us with new questions to consider. To ensure the cafe is inviting and comfortable for a wide variety of people with different needs and life experiences, how might we alter the design? If we know that the cafe will welcome survivors who have experienced trauma, what might we change or add?

The Orbits principles in practice

[Image: icons of the eight trauma-informed design principles: safety, agency, equity, privacy, accountability, plurality, redistribution of power, and hope]

We can apply the Orbits principles to technology design and policy to see what this might look like. Below are a few examples of how tech companies can employ the Orbits principles to tackle TGBV.

1. Safety

  • Testing all technology for abusability by conducting threat modelling at multiple stages of the design lifecycle. See the Trust and Abusability Toolkit (PDF)
  • Offering two-factor authentication (2FA)
  • Adding a safety exit button on websites that takes users to an inconspicuous website in case someone is watching them; to support emotional safety, consider redirecting to something comforting instead (a minimal sketch follows this list)
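
As an illustration, here is a minimal sketch of how a ‘quick exit’ button might be wired up in the browser. The destination URL, the element id, and the Escape-key shortcut are assumptions invented for this example, not details specified in Orbits.

```typescript
// Minimal "quick exit" sketch for a support website (illustrative only).
// The decoy URL and element id below are assumptions for this example.
const EXIT_URL = "https://www.bbc.com/weather"; // innocuous destination (example)

function quickExit(): void {
  // location.replace() swaps the current history entry, so pressing
  // "back" does not return the user to the support site.
  window.location.replace(EXIT_URL);
}

// Wire the handler to a prominent, always-visible button.
document.getElementById("quick-exit")?.addEventListener("click", quickExit);

// Also bind the Escape key, so users can leave without reaching for the mouse.
document.addEventListener("keydown", (event: KeyboardEvent) => {
  if (event.key === "Escape") {
    quickExit();
  }
});
```

Note that this only protects the current tab’s history entry; earlier browsing history and other open tabs are outside its reach, which is why many services pair a quick exit button with guidance on private browsing.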

2. Agency

  • Offering tools that people can customise and use at their own pace
  • Creating flexible mechanisms that enable people to describe their own experience and share the remedial measures they wish for, rather than forcing reports into rigid, predetermined categories
  • Building room for consent at various stages, especially in reporting processes

3. Equity

  • Ensuring strong referral pathways to specialist services for survivors from marginalised communities
  • Introducing voice-activated reporting mechanisms to account for different literacy levels and the diverse technology needs of different communities
  • Rolling out new safety features simultaneously in low- and high-income countries

4. Privacy

  • Clearly indicating what data is publicly accessible and what isn’t
  • Maintaining strict confidentiality for reporting processes
  • Withholding survivors’ details from the perpetrator when taking any punitive action

5. Accountability

  • Communicating to survivors which team handles their report, and informing them that a dedicated, specialist resource exists to do so
  • Actioning user research and feedback in design
  • Being clear about the hours of your service or the boundaries of your support

6. Plurality

  • Training moderators to understand cultural context
  • Refraining from assuming which language is spoken based on location
  • Recognising that people in digital spaces might experience multiple forms of discrimination/hate (for example, gender and race discrimination). Therefore, in complaint processes, it should be possible for survivors to identify multiple offences, including offline ones

7. Power redistribution

  • Giving survivors decision-making power in tech companies through compensated board or committee positions
  • Consulting communities through different stages of research, design, and implementation
  • For global firms, using local teams and networks to gather ideas for ways to improve services

8. Hope

  • Using an empathetic tone in written and vocal communications
  • Designing simple, soothing, and visually appealing user experiences
  • Taking proactive and communicative steps to stop tech abuse, for example flagging and/or blurring offensive content and creating digital fingerprints to block re-uploads of flagged content (a minimal sketch of fingerprint-based blocking follows this list)
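
To make the fingerprinting idea concrete, here is a minimal sketch of hash-based upload blocking using the Web Crypto API. The blocklist contents and function names are invented for this example. Production systems typically use perceptual hashes (such as PhotoDNA or Meta’s open-source PDQ) that survive re-encoding and cropping; the plain SHA-256 used here only catches byte-identical re-uploads, but it illustrates the overall flow.

```typescript
// Sketch of fingerprint-based upload blocking (illustrative assumptions:
// the blocklist and function names are invented for this example).

// Fingerprints of previously flagged files, stored as lowercase hex strings.
const flaggedHashes = new Set<string>();

// Hash a file's bytes with SHA-256 and return the digest as hex.
async function sha256Hex(data: ArrayBuffer): Promise<string> {
  const digest = await crypto.subtle.digest("SHA-256", data);
  return Array.from(new Uint8Array(digest))
    .map((byte) => byte.toString(16).padStart(2, "0"))
    .join("");
}

// Record a fingerprint when moderators flag a file.
async function flagUpload(upload: File): Promise<void> {
  flaggedHashes.add(await sha256Hex(await upload.arrayBuffer()));
}

// Check a new upload against the blocklist before accepting it.
async function isBlocked(upload: File): Promise<boolean> {
  const hash = await sha256Hex(await upload.arrayBuffer());
  return flaggedHashes.has(hash);
}
```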

These are just a few examples of what the Orbits principles look like when applied to technology. You can read many more in Orbits: download the full guide at c.chayn.co/orbits or read the technology chapters here.
