Launch of the 2024 Community of Practice on Human Rights & Technology
Last week, Tech Legality kicked off the first session of our Community of Practice on Human Rights & Technology. We are following the open-access course materials from the Trust & Safety Teaching Consortium. In this first session we discussed readings on the origins of trust and safety as a profession and emerging discipline, and we looked at how the terms and definitions used to describe online harms relate to those used in human rights law and other legislation. We also asked what human rights law says about the duty to accommodate not just the majority, but also vulnerable minorities, even when they make up very few of a company’s customers.
Our routine is to go into breakout sessions of around 8–10 people to discuss one or two questions that situate the reading materials in human rights law and practice. Our Community of Practice now consists of over 75 people, with a really rich mix of expertise, life experience, and geographic reach — so far we have people from Kenya, Zimbabwe, Chile, the United States, Turkey, Germany, Pakistan, Portugal, Spain, Italy, the United Kingdom, France, Nigeria, the Netherlands, Ireland, India, Indonesia, South Africa, and Denmark. These breakout rooms feel like quite an awe-inspiring hive mind to draw on!
The first reading in the Trust & Safety curriculum was by Ronald Robertson, on Uncommon Yet Consequential Online Harms. Robertson points out that “a narrow focus on main effects in the population as a whole almost necessarily means a focus on effects in the group with the greatest numerical representation”, and that this leaves out more vulnerable people and minorities. Robertson invites the research community to think about the “curb-cut effect”, which describes how laws and programs designed to benefit vulnerable groups can end up benefiting all of society.
For the purposes of the Community of Practice, we supplemented this reading with a key piece by Afsaneh Rigot, Design From the Margins. Rigot gives some really great examples in this text of how designing for the ‘edge cases’ can improve the user experience for everyone. For example, she led research with Article 19 and local organisations in Iran, Lebanon, and Egypt that centred and worked with MENA LGBTQ communities, many of whom held doubly marginalised identities. They worked with Grindr, the dating app, to co-design changes such as discreet app icons and a self-destruct PIN. These features allowed people to change their app icon to look like a calendar or calculator, providing an extra layer of protection during device searches by police in countries that criminalise LGBTQ identities. Rigot notes that this feature was created solely on the basis of these ‘edge cases’, but it proved so popular with Grindr’s users globally that it went from being available only in high-risk countries to now being available internationally, free for all users. Rigot’s full paper is really worth a read, as it is packed with other case studies like this.
We also discussed the Abuse Types defined by the Trust & Safety Professional Association. Many of the lawyers in the group were keen to tease out which of these terms are legal terms, and which are more moral judgements that are harder to define in law. This kind of debate came up in the drafting process for the UK Online Safety Act, which has tried to address legal but harmful content, as well as illegal content. Defining what is legal yet harmful is essentially a moral decision, although there may be instances in which content that is not illegal as a category could still have consequences that breach human rights laws. But whether or not that is the case in specific contexts inevitably leads to the lawyerly conclusion that ‘it depends…’.
Finally, we discussed the podcast from Data & Society on the origins of trust and safety. The podcast made important points about the need for more diversity in trust and safety teams, including far better representation of African American and Latino people. Our Community of Practice would like to see this idea of diversity also extend beyond the United States, actively seeking to represent people from diverse geographies around the world who are just as impacted by American tech, and may often represent the sharpest ‘edge cases’ that Rigot refers to above.
Our next Trust & Safety curriculum session will be on February 20th, when we discuss Government Regulations. In the meantime, next week’s Community of Practice will look at the topic of age assurance. We will consider how to address emerging laws, regulations, and standards from a human rights perspective, especially with an eye on considerations for the deployment of age assurance tools in the Majority World.