Applied
Mar 4, 2018

Kate Glazebrook and Andrew Babbage

Every day we work closely with our users to make Applied the best product it can be.

But there’s nothing like being a client of your own product to make you realise its shortcomings.

We recently hired the exceptional Andy Babbage into the role of Head of Growth. Here is what we learnt using our own platform to hire him — from both the hiring manager (Kate) and the candidate (Andy).

Kate’s perspective as hiring manager

How we structured the hiring process

Evidence shows that the more you can test candidates on what they’re actually going to do in the day job, the more likely it is that you’ll find someone who will thrive in it. Testing using job preview, or ‘work sample’ assessments is a core aspect of the Applied platform, so it should be no great surprise that that’s how we assessed the 60+ candidates who applied.

Stage 1: Online application through Applied

We ditched the CV, and instead asked candidates to answer four questions (max 250 word responses) that captured the key challenges of the job:

I: Why do you want to join the Applied team? Why now?

(Motivation, passion)

II: Pick two competitor products in the applicant tracking system market and tell us what we can learn from them.

(Knowledge of the market, strategic thinking)

III: It’s been a busy week and it’s now late on Friday. You have five things that you’ve yet to get to this week.
1. The UK government has just released a large procurement opportunity for a tool just like Applied. If you want Applied to be considered you’ll need to submit an initial proposal by the end of the week.
2. An investor has asked for a detailed update on the sales pipeline by the end of the week.
3. A client has just emailed saying they’re having trouble with a role that’s due to go live today and they need your help with a question.
4. You’ve not yet quantified your team’s progress against team goals for this month or set the priorities for the coming month. The meeting is on Monday morning.
5. A journalist would like to write an article showcasing one of Applied’s clients, but you need to get them the material today if you want to meet the publication deadline.
Imagine you only have time to do two of these tasks, which two do you choose and why?

(Prioritisation, judgement)

IV: You and the team are leaving a pitch meeting with a prospective investor. You think she took well to the idea but to get her over the line, she’d like a bit more detail on the scale of the market opportunity and Applied’s unique selling points. Write the follow up email you’d send her on behalf of the team.

(Persuasiveness, business acumen, communication skills)

Boy were their answers illuminating. We had scoring guides for each question which helped us to stay consistent, but we also explicitly got reviewers to prioritise answers that made us think differently.

Each candidate was blind-scored simultaneously by people in the team using our debiased review method. As you can see below, the motivation question was slightly more divisive than the others: we disagreed on our scores by an average of 0.54 points on a 5-point scale. And while the investor email question was the toughest overall, we tended to agree more on what good looked like.^

^ While I wasn’t deliberately trying to be political with my reference to a female investor, it was notable that some people unthinkingly addressed their mock response to a made-up male name.

We track how written and interview questions fared. Are they too easy? Too hard? Too subjective?

Stage 2: Product interview

The highest scoring candidates then went into a structured, scored, in-person product interview with Dan and Diana in our dev team. Working on a real product challenge allowed us to test how they assessed and communicated commercial opportunities in the context of product design, and how empathetic and curious they were about the challenges of building a product as opposed to just selling it.

Stage 3: Sales interview

Next was a mock sales meeting with some of our investors posing as potential clients. We gave the candidates some background material, but otherwise let them approach it in whatever way they liked. This tested sales proficiency and rapport, but also how they dealt with objections and structured the conversation.

We were particularly conscious in this meeting to leave time for the candidates to ask questions of our investors: to allow them the opportunity to probe things they were curious about and to learn more about the ethos of the company as seen from people who’d put their money behind it.

Stage 4: Reverse interview

Finally, for those still in the running we flipped it: giving the candidates the floor to ask Rich and me whatever they wanted. While we were interested in (and scored them on) the types of questions they came to us with, we used this final stage to find out who was really hungry for the challenge. Being in a small but fast-growing team isn’t for everyone, and they needed to choose us as much as we needed to choose them.

What worked

1. Job preview assessments removed gut-driven decisions and sped up onboarding

Ultimately, we collected over 1,000 independent data points from 9 different assessors on candidates’ skills in the most important aspects of the job. Any successful hiring process will yield multiple people who could do the job well, maybe even really well, and we certainly had that luxury. But each would bring a different bent to it, and having detailed, unbiased scores to play with made those tough trade-offs more data-driven and less gut-driven. This was particularly helpful given that we knew some of the candidates previously.

It also meant that Andy started on day one having already seen and experienced most parts of the job, putting him effectively several weeks ahead in his onboarding.

2. Candidates valued the feedback

Those who know us know we’re passionate about giving every candidate feedback on their application, and never was this more the case than for the 60 people we didn’t hire (yet). We were keen to offer candidates as much data as we could about why the fit wasn’t quite right yet, and it yielded the dividends you’d expect from a process that’s a bit more person-centric. And since we were lucky enough to attract an incredibly strong field of candidates, many of whom we’d like to work with in the future, this was crucial.

In one candidate’s words:

I found the process to be really engaging and refreshingly different from any other job application I’ve completed. What — no CV AT ALL?? The questions were fascinating, and completing them gave me a strong feel for the role. I’d definitely recommend Applied and will be doing so to everyone that will listen. I would also love a job board of live vacancies using Applied, as I feel that an employer that has decided to use Applied would be more attractive to me.

What didn’t work

1. We’d have liked to see more diversity in our applicant pool

There’s no shying away from it: while we certainly attracted high-calibre candidates from a wide array of educational and professional backgrounds, we could have done better on certain measures of sociodemographic diversity. We did all the things we had in the platform, from slightly skewing our job description toward feminine-coded language, to checking our questions were inclusive, to posting on diverse job boards.

How our job description was coded

We did relatively well on socio-economic and socio-educational advantage and age diversity, and OK on ethnicity, but we did pretty poorly on gender and disability. Just 25% of all our applicants were women, and only 3% declared a disability.

Our response: a new feature to measure where great, diverse candidates can be found

What it did help us to do was accelerate the build of a new feature in the platform that measures where candidates are being referred from and links that to socio-demographics and performance. This finally allows us, and our clients, to identify the sources that drive the happy overlap in the Venn diagram of high-performing and diverse candidates.

It helped us to identify that, at least for this job, LinkedIn got us some fabulous candidates but they skewed male; Guardian Jobs was helpful on gender but had low volumes; and Indeed and AngelList weren’t a great fit for us on either metric.

2. Ambiguity aversion is real, and I underestimated it

Behavioural scientists know that ambiguity is a killer. People hate not having at least a few parameters they can control or predict. As a candidate, that can be as simple as knowing when you will hear back about the next stage of the process, what the next step will involve, and even basics like how to navigate most easily to the office.

I tried hard to communicate with our candidates, but in feedback sessions a number of them (tactfully!) told me I could have done much more on this. Simply trying to be warm and welcoming (ensuring there were refreshments and some blood-sugar support in all meetings) was insufficient if people weren’t sure where they were going to end up.

Our response: defaults and reminders

Some of this comes down to the personal touch, but we’re exploring ways to help hiring teams with it: from what the default emails to candidates say about the interview, to how we write the copy interviewers read when they commence a structured interview. In part, it’s as simple as explaining to a candidate why collecting a little bit of data throughout an interview helps you make better decisions, even if it can feel a little unusual at first.

Andy’s perspective as a candidate

I was on the receiving end of the Applied recruiting process, which was a rigorous combination of work sample questions, job simulations and reverse interviews. To say it was different from my other recruiting experiences would be the understatement of the year. It was largely much better and completely refreshing, with a dose of uncomfortableness thrown into the mix. Here’s what I thought worked and what needs to be improved:

What worked

1. No CV or cover letter

I was definitely uncomfortable at first applying for a job without using my trusty CV. This is a document that I’ve spent countless hours crafting, tweaking and/or completely overhauling — right down to the spacings between bullet points and placement of full stops, which were all strategically placed to give me the edge over other candidates.

I was both dismayed and delighted by the idea that the few guaranteed door-openers on my CV would not even be looked at. But then I thought: why should they be? It was a refreshing thought that I would be judged on how well I handled the work sample questions alone. Plus I wouldn’t need to write a cover letter, which can be as painfully boring to write as it is to read.

2. Work sample questions and job simulations

I really loved the work sample questions as they actually got me excited about doing the job. For each one I had to do some high level research, so I started to get a feel for what the role would involve and the industry and team I would be working with.

I love a good mix of day-to-day tactics and strategic thinking, and I think the team really nailed the balance of the questions in this respect. I felt like I could really demonstrate my thought process for all questions, and the short length meant they weren’t too onerous while forcing me to make good use of each word and not waffle. It’s the first job application I’ve done where the questions genuinely got me thinking.

What didn’t work

1. Ambiguity

The ambiguity was uncomfortable, and there were at least two points in the process where I had convinced myself that I was out of the running and that my next call from Applied would be “thanks, but no thanks”. This was despite Kate’s diligent follow-up emails letting me know the exact next steps and timeframes. Despite all of that, I was still trying to second-guess how I’d done in the last stage: was I being kept warm while another person got the job, or had I turned them off so badly that they would never call me again? I think this is partly human nature, partly a hangover from nightmare recruitment experiences, and partly something we can do even better on the Applied platform.

Overall, I actually enjoyed the Applied recruitment process, although I definitely felt discomfort at certain stages. I ultimately think the structure enabled me to showcase myself on my own merits. It’s a testament to the Applied philosophy and platform that the team strictly adhered to the best research-based recruitment methodologies and did not cut corners at any point. The Applied team tells me they’re confident they picked the right person for the job… I guess only time will tell!

Andy Babbage is Head of Growth & Partnerships at Applied, a SaaS platform that increases hiring precision and reduces bias. He’s a reformed engineer, biz strategy lover and enthusiastic amateur on gender.


Originally published at medium.com on March 4, 2018.

Finding Needles in Haystacks

The ongoing story of Applied, a team obsessed with using science to make workplaces fairer and more efficient by removing hiring bias and replacing it with things more predictive of potential. (photo: Shibuya Crossing © Joshua Damasio)

Written by Applied: Using science to make recruitment smart, easy, and fair. www.beapplied.com
