Seeing the bigger picture

How we took steps to widen our understanding of social inequality to better design an inclusive product

Andy Craig
Published in Mission Beyond
6 min read · Jun 14, 2021


Photo by Priscilla Du Preez on Unsplash

Working as part of the Mission Beyond initiative, the sheer size of the challenges we’re up against is never far from our minds, but the numbers involved can make it difficult to comprehend the human cost. We’re regularly confronted with percentages that can hide the individual stories that every single one of these people has to tell.

  • Early years — Children from low socioeconomic status families perform 11 months behind their peers from middle-income families.
  • School years — At 16, only 24.7% of disadvantaged students get a good pass in English & Maths GCSE, compared with 49.9% of all other students.
  • Further education — Only 21% of students who receive free school meals are likely to study three A levels, compared with 47% of all other students.
  • Higher education — Only 26% of 15-year-old state-funded pupils who received free school meals went to university, compared with 45% of all other pupils.

As a team who have been largely untouched by such issues, how can we ever begin to understand the everyday challenges faced by the young people we’re trying to reach?

Without wanting to make assumptions about each team member’s background, a quick glance around the screen during the first team meetings was enough to suggest that none of us could be considered part of our target audience, and it was one of the main risks flagged during our pre-mortem workshop. If we failed to engage with our key users or understand their motivations and experiences, we’d be giving ourselves little chance of success.

Matt’s already written about our approach to user testing, and to be absolutely clear, there is no better way to understand people’s perspectives than to provide a safe space for them to share those perspectives directly with you, asking questions and actively listening to their answers.

For us to better understand the context of these answers, though, and open our eyes to the wider systemic issues that perpetuate the lack of opportunities our interviewees were describing, we introduced a “wider understanding” session. For an hour each fortnight, we’d get together as a team to discuss what we’d learned from a relevant film, documentary or podcast suggested the previous week. Here’s what we focused on in each session:

  • Small Axe: Education (BBC): Part of Steve McQueen’s award winning anthology series, Education highlights the unofficial education policy of the 1970s that saw disproportionate numbers of black children transferred from mainstream education into schools for the “educationally subnormal”.
  • Coded Bias (Netflix): A documentary investigating the bias of algorithms after uncovering flaws in facial recognition technology, led by founder of the Algorithmic Justice League Joy Buolamwini.
  • The Social Mobility podcast (Guest: Nick Owen): An episode from Tunde Banjoko OBE’s podcast series of discussions with senior leaders about diversity, equity and inclusion.
  • Social Mobility and Inequality: A Dance With The Devil: A TED Talk from Dr Wanda Wyporska, whose journey took her from young carer to CEO of a charity focusing on social and economic inequality.

Each session was loosely structured, allowing a few minutes at the beginning for everyone to post their thoughts on post-its before grouping them into themes and letting the conversation flow from there.

In our first session, whether because the team was not yet familiar with each other or because of the comparatively sensitive topics of discussion (or, most likely, a combination of the two), some of the team initially felt uncomfortable sharing their thoughts on emotionally charged subjects with which they had no direct experience. Indeed, when discussing Education, some of the team weren’t aware that “educationally subnormal” schools had existed, or that black children were systematically placed in them regardless of actual ability — not to mention the generational impact these policies created.

As our team’s psychological safety increased over the weeks, we developed the language and confidence to talk about each of the subjects as a group. Although much of Coded Bias focuses on America, we discussed the 2020 exam grades algorithm fiasco as a UK example, recognising the wider systemic issues that unfairly punish students from disadvantaged backgrounds while benefiting those attending private schools.

After each session, we started to see related links being shared in the team Slack channel as we reflected on the subjects we’d covered, overcoming any discomfort as team members engaged more deeply with the myriad factors that contribute to societal inequality. Being able to rise above any feelings of unease when talking about social issues is vital if we, as a group of people largely unaffected by the system, are to contribute to positive systemic change: quite simply, our feelings will never outweigh the experiences of those directly affected, and should never take precedence.

By holding these fortnightly sessions, we aimed to raise awareness of the everyday discrimination and hidden barriers that affect young people from underrepresented backgrounds, and to decentre ourselves from our own worldview and recognise the privilege we may have benefitted from. I, for one, have never been routinely stopped by police, have never been told to work twice as hard as my school mates, and have never had reason to fear that the systems I live within would punish me without due cause. And I would never have realised I was unaffected in these ways had I not taken the time to see and hear the perspectives of people who have lived a different reality to my own.

To bring this back to our project: what impact has all this had on the product we’re creating? I feel it can be seen in every aspect of our prototype. All of us on the team are familiar with standard job sites, and it would have been easy to follow this well-trodden path. The collective work of taking on board diverse perspectives, though, and reflecting on how these sites may reflect our own privileges, helped us realise that replicating this experience simply wouldn’t be enough for our target user. This has in turn influenced our work — whether it’s initially helping our users discover employable skills they weren’t aware they had, or presenting job adverts in jargon-free language they’re more familiar with — every decision has aimed to make our users feel safe and reassured, and to build their confidence throughout.

It can be seen in the design detail too. One of the themes emerging from our discussion on Steve McQueen’s Education was the power of positive reinforcement for individuals who are used to having their confidence knocked down. Within our prototype, the skills that are pulled out of the user’s interaction with our “chatbot” are then explicitly carried through into other steps in the journey, acting as a subtle positive reminder of the value each user can bring.

Our user’s skills will be carried through on each stage of their journey through Open Doors
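As a purely illustrative sketch of that design detail (every function name, skill label and message format below is invented for illustration, not taken from the Open Doors prototype), carrying the skills surfaced by the chatbot through to later steps of the journey might look something like this:

```python
# Hypothetical sketch: skills surfaced in the chatbot conversation are
# restated at each later step of the journey as positive reinforcement.
# The phrase-to-skill mapping and step names are invented examples.

SKILL_MAP = {
    "looked after my younger brother": "caring responsibility",
    "organised a five-a-side team": "organisation",
    "helped friends fix their phones": "problem solving",
}

def extract_skills(chat_answers):
    """Match toy phrases in the user's free-text answers to employable skills."""
    return [skill for phrase, skill in SKILL_MAP.items()
            if any(phrase in answer for answer in chat_answers)]

def render_step(step_name, skills):
    """Each journey step reminds the user of the value they bring."""
    return f"{step_name} | you bring: {', '.join(skills)}"

answers = [
    "I looked after my younger brother",
    "I organised a five-a-side team",
]
skills = extract_skills(answers)
steps = [render_step(name, skills) for name in ("Profile", "Job search", "Application")]
```

The design choice being sketched is simply that the skills list is computed once and then threaded through every subsequent screen, rather than being mentioned only at the point of extraction.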

It also speaks to our project’s future. The biases at the heart of Coded Bias are often not deliberately built into AI systems, but instead arise from non-diverse data sets, and thus serve to reinforce the structures already in place. As a team, we consciously recruited user testers with a variety of life experiences to combat this, but we need to be mindful of casting the net even more widely during future user testing recruitment.

Closer to home, engaging more widely with the subject matter has also emphasised the importance of our team better reflecting the diversity of the users we’re trying to reach, something that must come into consideration as the project progresses.
