Practitioners test the Orbits v.1 model for trauma-informed research and tech design

We brought together researchers, activists, campaigners and technologists from around the world to explore, develop and refine our principles for trauma-informed research and technology design to end tech abuse — here’s what happened.

Naomi Alexander Naidoo · Published in Orbiting · 6 min read · Aug 17, 2021


Photo by Daniel Olah on Unsplash

The Orbits journey — led by Chayn and End Cyber Abuse (ECA) — is progressing at pace, as we venture towards our end destination: a field guide on intersectional, survivor-centric interventions to end technology-facilitated gender-based violence (TGBV). The guide will focus on interventions across research, policy and technology. Since we began this voyage at the start of 2021, we’ve mapped gaps in existing resources and best practice; interviewed experts from around the world to understand what TGBV looks like in their context and how they are tackling it; held workshops at MozFest and RightsCon to explore ideas for intersectional interventions; and documented our own practices based on Chayn and ECA’s work to date, producing version one of our trauma-informed design principles for technology, research and policy.

Orbits: the journey so far

For the next stage of our Orbits adventure, we hosted a series of consultation workshops with people working to end tech abuse across the globe. The first three workshops each looked at one of the key areas of the guide — research, tech and policy — while the final workshop was exclusively for survivors, to ensure that their experiences and insights are given primacy in the final version of the guide. Here, we’ll share a summary of the conversation and key takeaways from our first two workshops on research and technology. Both workshops explored version 1 of Chayn’s trauma-informed principles for research and technology, and got participants’ feedback on each of the principles, as well as the model as a whole.

Chayn’s trauma-informed design principles, version 1

Research

To begin to tackle TGBV we have to understand it, and that takes research. But, too often, research into GBV is extractive and retraumatises survivors. As Chayn’s Founder Hera Hussain has written, ‘When survivor insights are treated like an asset but their own agency in the process isn’t, it’s extractive. When interviews force survivors to disclose trauma in gory detail when there is no need for it, it’s extractive. When survivors are consulted but have no idea of why and how their experience will be used, it’s extractive. When questions aren’t asked with the understanding that trauma might elicit leading responses, it’s extractive. When language, culture, race, disability and other characteristics aren’t considered even when survivors mention it, it’s extractive. When survivors are asked for their opinion but the end product remains unchanged, it’s extractive. When consent is assumed and not explained and asked, it’s extractive.’

Our first workshop set out to explore how research might be different, and how we can create research environments that are trauma-informed and survivor-centric, rather than extractive. We were joined by researchers from all over the world, working on a range of topics related to TGBV: how women recover after experiencing TGBV, and techniques to help with their trauma recovery; reporting processes for GBV for trans women in the UK; how targeted TGBV is coordinated across platforms; and how online discourses have fuelled the backlash against Muslim women’s right to citizenship. We asked for their feedback on Chayn’s model for non-extractive research and on whether and how the model might support their work, and crowdsourced ideas for non-extractive research practice. Here are some highlights of the discussion:

Feedback on principle model:

  • The number of principles is somewhat overwhelming and may be difficult to work with, so we should explore how to group, consolidate or reduce them.
  • It’s imperative to think about how the model applies to quantitative research and working with big data.
  • The principles work well with ethical principles of research and existing good practice in this space.
  • All principles are broad and open to interpretation, and will apply to different contexts in different ways.
  • Hope is a foundational principle and is important for all the others to work; however, we must be careful in applying it, especially in ensuring that we do not offer false hope.

Ideas for non-extractive research practice:

  • Consider and capture the context of the experience
  • Allow people to choose their own pseudonyms
  • In surveys, allow participants to share experiences in their own words and add their own options to multiple choice questions
  • Carry out grounding exercises during research activities
  • Consider not doing the research at all — think about when you can use existing research and/or proxy users instead

Technology

Our next workshop looked at how these principles apply to technology design, and how we can create trauma-informed technology that tackles, rather than facilitates, TGBV. We began by considering tech vulnerabilities — the features of tech platforms and companies that enable harm. Building on a previous workshop we held at RightsCon, we added to our map of tech vulnerabilities.

Vulnerabilities in tech platforms that allow or enable abuse

We then got feedback on the principle model in the context of technology design, and gathered ideas for how tech platforms can effectively mitigate and tackle TGBV in a way which is survivor-centric, trauma-informed and intersectional. Here are some of the key takeaways from the discussion:

Feedback on principle model:

  • We need to be prepared for tech companies to dilute or diffuse the principles, or claim them without actually following them. Concrete examples, tangible actions or measurement mechanisms for each principle may help to counter this.
  • Tech companies need to move from a blame model to a healing model — interventions should not focus solely on punishing perpetrators, but also supporting survivors.
  • There are a lot of tensions around friction and privacy in tech design — friction is important for mitigating abuse, but too often the friction impacts those affected by, rather than committing, abuse. Equally, privacy that is vital for survivors can be used and abused by perpetrators.
  • The business models of tech companies may be in fundamental tension with these principles.

Ideas for mitigation techniques:

  • Educating users during onboarding (and at intervals after that) about what abuse looks like, so they know how to recognise and report it.
  • There should be transparency about what happens in the reporting process, how feedback is incorporated, and who makes up the teams who make decisions.
  • Take the burden of reporting off the user by making it very easy and/or allowing them to designate someone else to report on their behalf, without giving full access to their account.
  • Option to lock images so that they cannot be shared.
  • Supporting smaller/independent tech platforms with different governance structures.

We’re so grateful to all our workshop participants for joining us and sharing their rich insights, perspectives and ideas, only a fraction of which could be captured here. We’ll be using this input, along with input from our other workshops, to refine our principles and produce a final version for the Orbits field guide. Watch this space…
