Improving Gender Diversity in Engineering
(Bob Raman, Tyler Anderson, Wai Chee Yau, Jeff Theobald, Stefan Vizzari, Adel Smee)
One of our goals at Zendesk has been to improve the gender diversity of our engineering teams. My team, Data Engineering, is reasonably ethnically diverse but we could improve our gender balance.
So when the opportunity came to fill a couple more roles over the last six months, we decided to focus our hiring process on addressing this imbalance.
We started our journey as a team with a fairly traditional hiring process, i.e. one that was heavily weighted towards candidates having a computer science degree, with an emphasis on testing mainly algorithm and data structure skills.
Knowing that we want to attract top talent and support the local software community, Zendesk Melbourne hosts a number of meetups. When hiring, we use these meetups as an opportunity to announce what positions we have open. While hiring for the data engineering team, we made an effort to target meetups like Girl Geek Dinner and Rails Girls that would help us improve our gender diversity.
We sourced candidates from sites like LinkedIn and GitHub, again placing an emphasis on finding female engineers. We made sure to cross-reference a candidate’s GitHub account with any talks they had given or blogs they had written to best assess fit.
Once we had identified a list of candidates, we set up a mixed panel of largely female staff to help review. We wrote custom emails for each candidate, citing some of the reasons why we thought they would be a good fit and referencing some of their past work. We also included any blogs and talks that the Data Engineering team had done in the invitation email to give the candidate more context about the particular role. This strategy was one of the most effective ones we used, as a number of candidates we reached out to circled back even if they were not ready to move right away (more on this in the lessons below).
One note about finding talent: perhaps the single best source of quality candidates is your very own colleagues and engineers. At Zendesk, over 50% of our hires come from internal referrals. Internal people are best placed to assess whether a prospective candidate fits the criteria the team wants to address, such as gender imbalance. To be sure, we received some great recommendations from our internal engineers, and one of the final hires was an internal recommendation.
Improving the Process
This section outlines the various activities that we did to improve our interview process.
Our first task was to improve the interviewing skills of our hiring panel and learn about under-represented groups. The interview panel consists of the engineering manager and 4–5 engineers from the Data Engineering team.
We started by reading best-practice articles, such as one from Google, as well as looking over a “Zen of Interviewing” guide put together by our internal recruiting team. What’s more, we looked at how other teams within Zendesk conduct hiring, and talked to both HR and Recruiting for ideas. We also discussed how our existing interview process could potentially disadvantage female candidates.
One best practice, which came up repeatedly in our reading, was reducing bias in the interview process, especially “confirmation” and “similar to me” bias.
Finally, we agreed that gender diversity should be a goal, but not the only goal, in hiring for the Data Engineering team. In addition to diversity we also required that the candidate met the high standards we set for ourselves and for Zendesk.
To minimise personal biases, we started by writing a hiring guide.
The purpose of the hiring guide is to outline the interview process from start to finish: the people on the interview panel, the number of interviews, and the content to be covered in each interview. The hiring guide is also where we define our technical, behavioural, and situational questions, identifying beforehand what would constitute a poor, good, or excellent response. For technical questions in particular, we agreed to ask only questions we had already solved at work ourselves. All of these practices help reduce bias when evaluating candidates.
Finally, the hiring guide is a good reference point for any new interview panel members, as well as recruiters and other people external to the team.
Next we reviewed the job description. We used tools like Textio to check for gender neutrality. In fact, Textio gives you a better rating if the job description is weighted towards female candidates.
We also changed the focus to one where we explicitly checked whether “you can do the job”, rather than “do you have the right degree.”
We designed the interview process with a couple of key things in mind:
- Get panel agreement on what outcomes and associated competencies we are hiring for.
- Only ask questions that we had solved at work ourselves. Our reading indicated that we should shy away from the textbook design/architecture/algorithm questions we had used previously, since these disadvantage under-represented groups.
- Hire people with shared values. E.g. respectful, constantly learning, humble.
- Strive to hire someone who may be different to us in some way, that is to say, a new hire should strengthen, not simply supplement, our existing team.
- Treat interviews as a two-way street: we interview the candidate and they interview us, so the process has steps built in to allow interaction in a few different settings.
Finally, we wanted to make sure that the candidate had a great experience interviewing with our team. This meant being conscious not only of the time candidates would have to invest in tasks like the take-home assignment, but also of how we treated them.
The Interview Itself
Our assessment criteria and interview process are outlined below together with things we do to minimise bias.
In coming up with our assessment criteria to objectively compare candidates we looked for both the technical and non-technical skills that would best help someone succeed in the data engineering team. These included:
- Skills and relevant experience for the role — be clear about the skills and expectations.
- Teamwork/Collaboration — the take home assignment and pairing onsite together with behavioural and situational questions help validate this.
- Communication skills — e.g. how well the candidate can explain problems that they have solved.
- Problem solving — the candidate has a go at some of the problems that we have solved ourselves at work.
- Growth mindset — a passion for stretching oneself and persisting even when things are not going well.
We grade candidates in these categories (e.g. Poor, Fair, Good) rather than give them numerical scores. Note, however, that we do not base selection exclusively on the above criteria, but rather use them as a guide when discussing candidates.
In the end we settled on three rounds of interviews with the candidates bunched together so that it is easier for the interview panel to compare them. In all the rounds the interview panel is asked to write down their assessment shortly after the interview in a shared tool to minimise bias.
Round #1: Technical Phone Screen/Homework Assignment
For our first round we opt for a technical phone screen by the hiring manager rather than a video call, to reduce bias. When we use tools like https://hired.com.au, we choose to blank out the candidate’s name and see only their initials, reducing cultural and gender bias. A similar idea has been used in the music industry, where blind auditions reduce bias against women.
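The name-blanking idea above can be sketched as a simple redaction step. This is purely illustrative: `redact_name` is a hypothetical helper, not a feature of any particular screening tool.

```python
def redact_name(full_name):
    """Reduce a candidate's full name to initials, e.g. 'W. C. Y.'.

    Hypothetical illustration of blanking out names during screening
    to reduce cultural and gender bias; real tools handle this for you.
    """
    return " ".join(part[0].upper() + "." for part in full_name.split())

print(redact_name("Wai Chee Yau"))  # W. C. Y.
```

In practice the redaction would happen inside the screening platform, so interviewers never see the full name at all.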
If the candidate passes the technical screen we send them a homework assignment and ask that they return it within one week.
Round #2: Technical Onsite Interview
For the second round of interviews candidates are invited into Zendesk, where they are paired with two engineers and are asked to expand on the homework assignment completed in the first round. We encourage the candidate to bring their own laptop for this exercise. The idea here is to assess how well the candidate collaborates with other engineers, as well as the quality of their code, with a particular emphasis on writing good unit tests.
We also get a pair of engineers to take the candidate through solving a Data Engineering problem that we have solved in the past. To minimise bias we spend a good deal of time getting the candidate up to speed on the problem and the interviewers do regular reviews and practise the delivery of the problem. We also allow the candidate to use either paper, whiteboard or their laptop — whatever they are most comfortable using, to help minimise the stress on the candidate and allow them to perform at their best.
In both of the above sessions we also ask behavioural and situational questions that the team feels are valuable.
Round #3: Shared Values and Collaboration Interview
If all goes well in the first and second rounds the candidate is invited back for a third and final round, which focuses primarily on collaboration and shared values.
The candidate interviews with the other teams they will work with on a daily basis. In Data Engineering, this means meeting the Data Scientists, who check for collaboration, and the Product Manager, who checks for the candidate’s passion for building data products. The Head of Office also meets the candidate to check how their values align with Zendesk’s own.
Finally we invite the candidate for a coffee with all of the remaining members of the Data Engineering team, including QA, UX and Ops, to see how the candidate functions at the personal level and also to give them a more relaxed space to interview us.
The last step is the selection process. To avoid one interviewer biasing another, we prohibit rating a candidate until all interviewers are in one room. Panel members then simultaneously score the candidate and talk through their respective reasoning. A score of three means the panel member is ambivalent about the candidate, whilst a four or above is a strong sign to hire. Anything below a three usually signals that the panel member has concerns. Usually all panel members must give at least a three for us to move to the offer stage, though there are exceptions to this rule, as the final decision sits with the Hiring Manager.
In these discussions the interview panel members are encouraged to talk to the selection criteria in the hiring guide.
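As a rough sketch, the scoring rule described above might look like the following. The function name, the 1–5 integer scale, and the override flag are our own illustrative choices, not part of any formal tooling.

```python
def panel_decision(scores, manager_override=False):
    """Return True when the panel supports moving to the offer stage.

    Every panel member must score at least 3 (3 = ambivalent,
    4+ = strong sign to hire, below 3 = concerns). The hiring
    manager can override, since the final decision sits with them.
    """
    if manager_override:
        return True
    return all(score >= 3 for score in scores)

print(panel_decision([4, 3, 5]))                         # True
print(panel_decision([4, 2, 5]))                         # False: one member has concerns
print(panel_decision([4, 2, 5], manager_override=True))  # True: manager's call
```

The real process is, of course, a discussion rather than a formula; the rule above only captures the default threshold.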
Some of the lessons that we learnt from the interview process include:
- Having females on the hiring panel is really helpful for identifying gender bias. With a male-female interview pair you can check whether the candidate is only directing questions at male interviewers or ignoring/interrupting the female interviewers. However, this approach is not foolproof. In some cases it may be necessary to set up subsequent interviews, such as one with a senior female interviewer paired with a junior male interviewer, to clear up any uncertainty.
- Do not outline the engineering level of the interviewers when pairing. We found that introducing one interviewer as Senior and one as Junior would often bias the candidate to direct questions only at the Senior interviewer.
- Discuss articles on bias with the hiring panel and highlight confirmation and similar-to-me bias.
- Be sincere and put in effort when checking out a candidate’s contributions (blogs, talks, etc.). People really appreciate it if you do your homework on them before you reach out on LinkedIn. We received responses like the following: “At the moment I am not looking to change jobs, I’m having fun in my current role, but I wanted to thank you for the email, I have to say it was quite thoughtful and different from the template I usually get on LinkedIn.” At a minimum we have planted a seed, and the candidate will hopefully think of Zendesk when they are next looking for new challenges.
- When using LinkedIn to reach out to candidates there is no need to use private mode. We had a few candidates follow us and apply for positions because they saw an engineering manager looking at their profile.
The result was that we managed to shortlist three senior female software engineers and ended up making offers to two of them. One of the offers was accepted! The Data Engineering team’s gender and ethnicity balance is now as outlined below.
For now, gender diversity remains one of the biggest challenges faced by the software industry. Only by acknowledging its existence can we take the necessary steps towards creating workplaces that are balanced, fair, and inclusive. While that journey has been hard and full of lessons, we believe it is an important journey to take.
I would like to express my thanks to Wai Chee Yau, Erica Wass, Adel Smee, Tyler Anderson and Matt Woodard for helping with the LinkedIn based recruitment drive for female engineers.
I would also like to thank the interview panel — Sean Caffery, Jeff Theobald, Arvind Kunday, Wai Chee Yau, Damen Turnbull as well as our Data Scientists — Chris Hausler, Anh Dinh, Arwen Griffioen, Soon-ee Cheah and our Product Owner Mike Mortimer for their help with the recruitment process.
Special mention also to Eric Pak, Derrick Cheng, Jonathan Belotti, Balaji Sekar, Chris Holman and Nicky Urban-Weiss.
 Carol Dweck, Mindset: The New Psychology of Success, 2007
 Geoff Smart and Randy Street, Who: A Method for Hiring, 2008