You tested our models of tackling tech-facilitated gender-based violence at RightsCon

Here’s what you said.

Naomi Alexander Naidoo
Published in Orbiting
5 min read · Jun 22, 2021



Orbits is a joint initiative of Chayn and End Cyber Abuse, producing a field guide to tackle technology-facilitated gender-based violence (TGBV). Find out more here.

In June, our Orbits voyage took us to RightsCon, the world’s leading summit on human rights in the digital age, organised by Access Now. As the next stage of our quest to produce a field guide for ending TGBV, we held a Community Lab session to share our progress on the project so far and get input and ideas from RightsCon attendees. You can view the slides for the workshop here.

We were joined by around 20 participants, including researchers, activists, campaigners, UX designers and more from different parts of the world. We shared an overview of Orbits, our journey so far, version one of our trauma-informed principles for design, and elements of our policy framework. We then split into breakout rooms to explore in more depth the particular ways tech platforms enable abuse, and how they can mitigate it, and to seek out how the elements of our policy framework could best be utilised for survivor-centred outcomes.

We’re sharing this so that, even though you were not there, you can send us your thoughts by replying to this blog or by emailing Naomi, our Movement Builder.

What technology design vulnerabilities allow abuse to happen, and how can they be mitigated?

Ideas on tech vulnerabilities and mitigation methods from RightsCon workshop participants. Background photo by Ahsan Avi on Unsplash

We had a lively discussion about the many, many ways that tech platforms enable harm. Key tech vulnerabilities highlighted include:

  • Reporting mechanisms are Anglo- and Euro-centric, meaning both algorithms and staff teams often fail to deal appropriately with abuse that happens in non-European languages.
  • Tech companies often assume a certain level of knowledge or digital literacy among their users, as if everyone were a super user, resulting in the safety and privacy settings of their products being inaccessible to many communities, especially in rural areas.
  • There is a lack of human support; it is exceptionally difficult to reach a person to speak to, particularly in markets that are not seen as a priority by tech platforms.
  • Many tech platforms are risk-averse about trying new methods or ways of working to tackle this issue, such as co-design. Even when these methods are employed, the people of colour consulted are often those based in the Global North, who do not face the same life experiences and risks as those living in Global South countries.
  • Moderation of harmful content is often carried out by underpaid, exploited ghost workers, who are themselves harmed by doing what can be very traumatising work.

We also identified particular vulnerabilities of specific tech platforms, for example:

  • LinkedIn is often used for workplace harassment. It is also difficult for survivors to balance the privacy concerns of having a profile with the potential career impacts of not using the platform.
  • iCloud is regularly compromised by stalkers and abusers to access files and accounts and even take over devices.
  • Facebook groups are often used for coordinated harassment. Groups for survivors can feel like a safe space, but are easy to infiltrate.
  • YouTube hosts videos where abusers share tips and techniques for others to follow.

It was harder, as expected, to come up with mitigation strategies; these things are tough. Mitigation is hard in part because the same measures that can protect survivors can also curb their rights. Anonymous accounts are a good example: anonymity allows women in oppressive societies and families, and people who have been stalked, to create accounts and live freely online; but it also enables harassers and stalkers to harm them undetected and unpunished.

Some of the ideas we came up with are as follows:

  • Collecting comprehensive data on abuse and who it affects with an intersectional lens, and making this data publicly accessible.
  • Inclusive testing, with long-term impact in mind.
  • Getting tech platforms to recognise and responsibly account for how civil society and survivors use these platforms to share information, stories and prevention techniques.

How can policies adequately address tech-facilitated gender-based violence without re-traumatising survivors?

The second breakout group looked at our first iteration of the policy design framework, exploring how it might work in practice and identifying key challenges. The group examined individual elements of the framework, such as consent, confidentiality, intersectionality and accessibility, commenting on their strengths and weaknesses in light of specific circumstances.

End Cyber Abuse’s policy design framework

Here are some key takeaways from our discussion:

  • Accountability and a lack of confidentiality are major challenges, as many laws around the world criminalise survivors rather than perpetrators, and practical considerations make investigation and prosecution tougher. There are gaps in the law around platform liability, as well as around identification and accountability for the creators of stalkerware and other invasive forms of tech used to facilitate abuse; it can be incredibly difficult to identify those responsible, who benefit from an anonymity not afforded to survivors.
  • Accessibility is key: in the digital security field, we tend to use technical language that excludes many women and other marginalised individuals from participating. Tech platforms should make resources and information available to all in accessible forms and clear language, without technical jargon. Care must be taken that policies are not created with only the “able-bodied and privileged” individual as the target consumer.
  • For policymaking to be intersectional, local and regional contexts and the lived experiences of individuals should take centre stage. Rather than a one-size-fits-all approach, policies should be detailed and tailored, reflecting the actual and diverse experiences and realities of survivors.
  • Decentralisation is important. Ideally, governments would consult with survivors and impacted people, but in reality, governments and policymakers are far removed from those affected. There is thus a great risk of tokenisation, as stakeholders including governments and tech companies bring in civil society to tick a box rather than involving them (or survivors themselves) in deep, meaningful consultation, engagement and co-creation.

Where to next?

Thank you so much to all workshop participants for their enthusiasm and contributions, which will be incorporated into the Orbits field guide. For the next stage in our voyage, we’ll be hosting a series of workshops that explore further what trauma-informed, survivor-centred and intersectional practice looks like across policy, research and design. Find out more and sign up here.
