Helping small organisations share their diversity stats

Abbey Kos
Published in Doteveryone
Nov 21, 2017

Reporting and sharing diversity stats can be a challenge for small organisations, in part because most reporting guides and tools are designed to be implemented at scale. The mismatch between tools and teams can lead to skewed stats (for example, a company of 500 with 25% BME staff is very different from a company of four with 25% BME staff, where that figure describes a single person) and a lack of anonymity (for example, asking people to share their LGBT identity may feel less safe in a group of 10 than a group of 1,000).

Our adviser Russell Davies came to us and our friends at Buckley Williams to help him create a diversity reporting tool to use at BETC, the agency where he serves as Chief Strategic Officer. He wanted to build something simple, open, easy to use and designed specifically for smaller organisations.

Russell’s written a post on B-Side about the final product, as well as how BETC’s staff thinks about their background. This post is about the research behind the work — what we learned from asking people about sharing their diverse lived experiences.

We’ll be publishing the tool for open use in the new year. In the meantime, if you’d like to be one of our testers please get in touch.

Different questions have different sensitivities

Organisations need their staff’s consent to publish personal data. However, consent is usually obtained via a single yes/no question at the end of a form.

We decided to test a format where users could choose per question whether or not they wanted their answers public. It added extra work to the survey, but it also helped us see which areas were more sensitive than others.

Our results showed there was a difference: people were most willing to consent to information about their educational background and child care being shared publicly, and least willing to do so with information about their long-term health conditions and faith.

Users noticed and appreciated the chance to give consent on an issue-by-issue basis:

“The repetition of the question about sharing my data publicly after each section felt a bit clunky but also thoughtful. I liked having that level of choice about what was shared and with whom.”

If you’re asking your team about their background and experiences, build in a lightweight way for people to opt in and out of consent throughout. For example, an opt-out checkbox might be less clunky than the yes/no format we tested.
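
To make that concrete, here’s a rough sketch of what per-question consent could look like in code. It isn’t taken from our tool; the TypeScript types and field names are just illustrative.

```typescript
// A minimal sketch of per-question consent (not the actual tool's code);
// the type and field names below are illustrative assumptions.

type Answer = {
  question: string;       // e.g. "Do you have a long-term health condition?"
  response: string;       // the respondent's answer
  sharePublicly: boolean; // consent captured alongside this specific answer
};

// Only answers the respondent has agreed to share end up in anything
// published; everything else stays internal to the organisation.
function publishableAnswers(answers: Answer[]): Answer[] {
  return answers.filter((answer) => answer.sharePublicly);
}

// Example with made-up responses:
const survey: Answer[] = [
  { question: "Educational background", response: "State school", sharePublicly: true },
  { question: "Faith", response: "Prefer not to say", sharePublicly: false },
];

console.log(publishableAnswers(survey)); // only the education answer remains
```

The key point is that consent travels with each answer, rather than sitting in a single checkbox at the end of the form.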

People see their diversity holistically

Diversity tools tend to be orthogonal — each question is reported as if it were independent of the others. For our prototype, we provided an open field where people could list the ways they thought of themselves.

“I’m a Survivor, Irish, Male, Gay.”

“I’m a Global, Asian, Feminist, Woman.”

“I’m a Parent, Musician, African, Male.”

This was interesting both because it let people share a fuller picture of themselves and also gave us some insight into how they navigate their own intersectionality. There weren’t a lot of trends in how people “prioritised” themselves — “traditional” metrics of diversity, like ethnic background, gender or sexual orientation, weren’t more or less likely to be mentioned before terms like “entrepreneur” or “feminist”.

Our users were glad we asked these questions, but also wanted us to recognise they weren’t simple ones.

“‘Life experiences’ and ‘identities’ are quite nebulous — without writing an essay for these answers it’s pretty hard to say in a survey.”

If you’re designing your own tool for diversity research and reporting, giving people an open field to identify themselves holistically may give you a fuller sense of how your team thinks about themselves in a joined-up way. But be mindful of the line between important questions and existential questions.
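
If you want a concrete starting point, one hypothetical approach is a free-text field stored alongside the structured answers, with a simple tally of the terms people choose. Again, this sketch is illustrative rather than anything from our tool.

```typescript
// A hypothetical sketch: a free-text self-description stored alongside
// structured answers, plus a simple tally of the terms people use.

type Response = {
  answers: { question: string; response: string }[]; // structured questions
  selfDescription: string; // open field, e.g. "Parent, Musician, African, Male"
};

// Count how often each self-chosen term appears across all responses.
function tallyTerms(responses: Response[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const r of responses) {
    for (const term of r.selfDescription.split(",")) {
      const cleaned = term.trim().toLowerCase();
      if (cleaned === "") continue;
      counts.set(cleaned, (counts.get(cleaned) ?? 0) + 1);
    }
  }
  return counts;
}

// Made-up example:
const example: Response[] = [
  { answers: [], selfDescription: "Parent, Musician, African, Male" },
  { answers: [], selfDescription: "Global, Asian, Feminist, Woman" },
];

console.log(tallyTerms(example)); // each term counted once
```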

Asking makes an impact

Everything we do has an impact on others. Sending out a diversity survey to a team has the potential to change, even if just subtly, the way your team thinks about you, your organisation or even themselves.

We asked people for feedback on our survey, and in addition to the standard “I liked it” / “I didn’t like it” types of answers, we also had a number of responses about how it affected people individually.

Some felt validated:

“It felt freeing to know that someone made questions for what seems like daily micro aggressions we have been gaslighted to believe aren’t there.”

Some reflected on their lives in new ways:

“For me, I was struck that I had more difficulties growing up, as a perceived outsider in my small town (still with absolute privilege, natch) than I have had as an adult.”

And some talked about the challenges of being a member of marginalised communities in the workplace:

“I find the biggest challenge is raising the topic of my marginalisation in a way which doesn’t offend co-workers/collaborators or inhibit my ability to do well/ be perceived well. As a marginalised person you often police your otherness.”

If you’re building your own diversity tool, don’t forget the impact — both good and bad — it might have on others. Give people space to feed back their thoughts, and design questions with care.

You can always go further

One of the most interesting parts of user feedback is insight into what could’ve been done differently. We took great care to design a well-made, low-impact and respectful tool, and we accomplished that — people shared no major issues with the product or its execution overall.

What we did get a lot of were suggestions about ways to expand our thinking and minimise our assumptions. Users recommended questions about neurodiversity, being raised by a single parent, perceived attractiveness and more. We saw this as a good sign: people understood the point of our tool was to elicit broader conversations about what makes us unique.

Users also challenged us on some of the ways we presented questions. For instance, our questions about religiosity were designed to gauge if people felt discriminated against because of their faith; however, in doing so, we didn’t address people who have no faith tradition at all. This kind of feedback is invaluable for making things better.

If you’re making a tool to gauge your team’s diversity, remember that — just like any other product — it’s never finished. Collect user feedback for your 2.0 and beyond.

Although this work was led by Cassie Robinson, who heads our Digital Society work, this prototype sits under Responsible Tech. We think diversity is key to responsible production and maintenance of technology: the more diverse your teams, the more relevant, useful and inclusive your products and services will be.

We’ve developed this as a specific answer to a specific question (i.e., how might we create a lightweight tool that allows small organisations to gain a holistic sense of their teams’ diversity). We’re not at all saying this should take the place of other diversity surveys, nor are we ignoring the fact that the “standard” categories gauged by such tools (e.g., race, gender, sexuality) are often the ones associated with the most obvious and long-term systemic bias.

We’ll be using it ourselves to find out more about who we are, and to make sure we’re walking our talk.

If you have any questions, feel free to reach out to me personally or send a note to hello@doteveryone.org.uk. We’re always happy to talk about what we’ve been doing.
