How Can We Ensure Online Safety for Minority Groups?

CivicTech Contributor
Civic Tech Innovation Network
4 min read · Jun 19, 2024

Written by Sinenhlanhla Kheswa and Yasmin Shapurjee

As our online and offline lives become increasingly integrated through smartphones and the internet, digital targeting has become easier. Digital targeting may be defined as the intentional tailoring of specific content or messages to a certain group of people based on their personal or political interests, demographics, online behaviours, and similar characteristics. Some social groups, such as the LGBTQIA+ community, women, politically oppressed people, and activists, are actively targeted, persecuted, and exploited through digital technologies both online and offline. There is an urgent need for tech companies to prioritise user safety, particularly for minority groups, and for civil society organisations to advocate for more transparent and effective policies to combat digital targeting and violence.

Protecting minority groups from digital targeting was the focal point of a recent panel discussion in the Digital Dialogues series by the International Civil Society Centre and Civic Tech Innovation Network. The panellists, Dr. Nyx McLean, a Research Associate in the School of Journalism and Media Studies at Rhodes University, and Rasha Younes, interim deputy director of the LGBTQ+ rights program at Human Rights Watch, explored various aspects of digital targeting, highlighting the challenges faced by minority groups and proposing potential solutions to address them.

According to Dr. McLean, major tech companies, including Big Tech (such as Meta and Google) and smaller start-ups, often prioritise profit over the well-being of users. This apolitical stance, under the guise of “tech neutrality,” often reinforces violent hegemonies, leaving marginalised groups vulnerable to various forms of abuse. “Tech platforms wilfully ignore the vulnerable and have no sense of social justice or ethics of care around the users and how they could be harmed by these platforms,” said Dr. McLean.

One example Dr. McLean cited was the Left-Out Project, which explored transgender, non-binary, and gender diverse (TNBGD) experiences of online gender violence in Botswana, South Africa, Rwanda, and Uganda. Participants reported that their reports of violence on platforms such as X, formerly known as Twitter, were not treated with urgency: they went unanswered, were dismissed outright, or resulted in the victims’ own accounts being taken down. Reflecting on the project, Dr. McLean said the highlight was the positive impact of solidarity strategies among those affected by digital targeting, such as the mass reporting of posts and accounts that inflict violence on TNBGD people. “Platforms would respond more readily because they were receiving mass reporting,” said Dr. McLean.

Younes echoed Dr. McLean’s sentiments, noting that tech companies underestimate the role their platforms play in facilitating abuse. In a 2023 report titled “All This Terror Because of a Photo”, Human Rights Watch documented the offline consequences of digital targeting by security forces in Egypt, Iraq, Jordan, Lebanon, and Tunisia. The report demonstrated that online abuses against the LGBTQIA+ community are not temporary; the consequences have lasting effects on victims’ quality of life. Members of the community reported losing jobs, experiencing family violence, having to change residence, and facing severe mental health consequences due to online targeting. Younes reiterated Dr. McLean’s point that Meta platforms often fail to respond to complaints from the LGBTQIA+ community, or find that reported content does not violate their policies, thereby perpetuating the abuse, as “the content remains online.”

As a follow-up to the report, Younes shared that Human Rights Watch launched a campaign called ‘Secure our Socials,’ which aims to engage tech companies, particularly Meta, to improve user safety for the LGBTQIA+ community in the Middle East and North Africa region. The campaign further calls on Meta to disclose its annual investment in user safety and security and, most importantly, to make its content moderation practices better and more transparent. “Meta’s over-reliance on automation when assessing content and complaints also undermines its ability to moderate content in a manner that is transparent and lacking bias,” said Younes.

Younes further shared that, in an effort to address these violations, Human Rights Watch developed an awareness document for the LGBTQIA+ community with tips on how to spot digital targeting and how to respond in such instances.

In closing, both panellists shared their thoughts and aspirations on the matter. Dr. McLean suggested that technology start-ups should prioritise collective care and organising, ensuring that technology development is driven by the needs and insights of the people who use it. They added that it is important for society to consider the consequences of tech, and to “think carefully about the tech we use.”

Watch the full dialogue here.

Here are a few links to Dr. McLean’s work that might be of interest:

More academic items can be found on Dr. McLean’s Google Scholar profile: https://scholar.google.com/citations?hl=en&user=cLz38cQAAAAJ&view_op=list_works&sortby=pubdate

See links to campaigns from Rasha Younes:
