Structured Engineering Hiring at Thumbtack

By: Xing Chen

Photo by Alain Pham on Unsplash

When I joined Thumbtack nearly three years ago, we had 25 engineers and were just beginning to scale the team (to nearly 140 engineers today!). Needless to say, we had to hire fast. As we did, we quickly noticed some problems in our interviewing process.

Hiring fast meant we had to train new interviewers fast, sometimes within weeks of their joining the company. It became harder to get everyone to evaluate the same things in debriefs. One interviewer would claim that “the candidate was a great coder; very methodical,” while another would say “too slow and didn’t know about X library.” Who was right?

We also noticed that evaluation criteria like “culture fit” and “communication” were used in inconsistent ways. If a candidate did a poor job describing their last project, but did a great job discussing a systems design, were they a poor communicator? If a candidate had strong opinions and their pushback rubbed an interviewer the wrong way, did that make them a bad culture fit? We were risking making decisions based on cultural norms — hiring people like ourselves, instead of great engineers who would help Thumbtack succeed.

These ambiguous evaluation criteria were a recipe for letting personal bias creep in (whether it was conscious or unconscious). We realized that if we wanted to hire the best possible team, we needed to make more principled, consistent decisions to mitigate bias.

Since then, we’ve spent a lot of time and effort refining our interviewing criteria. We developed a set of interviewing principles and evaluation criteria that we reference when making hiring decisions for software engineers. Below, we share these guidelines with the goal of helping everyone understand what we are looking for.

Our efforts on structured interviewing have helped us move to more consistent hiring decisions over time. That said, we’re certainly not done, and continue to iterate on our interview calibration regularly — look forward to more updates on this in the future!

Eng Interviewing and Debrief Guidelines

The goal of these guidelines is to ensure that the engineering hiring process is effective in finding qualified candidates in a fair and unbiased way. To accomplish this, we use a calibrated and consistent approach to evaluating candidates during interview feedback and debriefs.

Interview Feedback Guidelines

  • Write up feedback immediately following the interview.
  • It’s imperative that you don’t speak to anyone on the panel before you write up your feedback. Each interview should be an evaluation of the candidate’s ability that is unbiased by other interviewers’ feedback.
  • Please add a summary to the top of your feedback, which contains an overall yes/no and some details on how the candidate performed in each of the pillars below.
  • Always make a decision and justify that decision. If you are unsure, write which way you are leaning as a “weak yes/no”, and justify why.
  • Feedback should stand on its own, even if you miss the debrief.

Debrief Guidelines

The goal here is to make sure we are debriefing candidates in a consistent way. [At Thumbtack, we do a 30-minute debrief to review the interview results for each onsite candidate.]

  • Everyone should read all feedback before entering the debrief room.
  • When sharing your feedback in the debrief, don’t just repeat your written feedback. Briefly summarize your feedback based on the four pillars below (coding, problem solving, technical communication, learning). Explain whether the candidate was strong or weak, with examples, in each pillar. Try to spend 2–3 minutes per person summarizing feedback.
  • The hiring manager will then moderate a discussion on the candidate. Their goal is to align everyone on calibration and ultimately make a hiring recommendation based on all of the feedback.
  • Each person should discuss whether they think the candidate meets the criteria outlined below. Use your written feedback as a reference, and try to avoid groupthink.
If a decision really cannot be reached within the debrief’s allotted time, the candidate’s packet will be brought to a broader group of hiring managers, who will then make a decision. This option should only be used as a last resort, ideally in fewer than 5% of cases.

Candidate Evaluation (One Pager)

This document summarizes our criteria for evaluating a software engineering (SWE) candidate.

Problem Solving

Methodically breaks down problems: Breaks problems into smaller pieces. Thinks ahead. Develops clear, sensible solutions and evaluates tradeoffs. Doesn’t lose track of where they are in the problem.
Tenacious / ability to dig deeper: Persists in the face of hard problems. Doesn’t get stuck, can dig deeper technically to make progress.

Coding

Writes clear, working code: Writes well-designed code that works. Code is free of obvious bugs. Checks work carefully for errors.
Algorithms & data structures: Solves algorithms problems with minimal help. Understands core data structures.
Complexity analysis: Accurately analyzes a solution’s runtime and/or memory usage.
Testing & debugging: Finds bugs efficiently and independently. Uses robust test cases and examples.
Language expertise: Fluent in at least one programming language.
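To make these coding criteria concrete, here is a small sketch of what a solution meeting them might look like (the problem and code are purely illustrative, not an actual Thumbtack interview question): clear, working code, an explicit complexity analysis, and robust test cases including edge cases.

```python
def first_unique_char(s):
    """Return the index of the first non-repeating character in s, or -1.

    Runtime: O(n) -- two passes over the string.
    Space: O(k), where k is the number of distinct characters.
    """
    counts = {}
    for ch in s:
        counts[ch] = counts.get(ch, 0) + 1
    for i, ch in enumerate(s):
        if counts[ch] == 1:
            return i
    return -1

# Robust test cases, including edge cases:
assert first_unique_char("leetcode") == 0
assert first_unique_char("aabb") == -1
assert first_unique_char("") == -1
```

A candidate who writes something like this, states the complexity unprompted, and checks their own edge cases would score well across the coding criteria above.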

Technical Communication

Explains complex technical ideas succinctly and precisely.
Listens well and understands technical explanations accurately. Asks and answers questions clearly.
Clearly states assumptions: Is explicit about what they do and do not know.

Learning

Receives feedback well: Appreciates and internalizes feedback without becoming defensive and without the need for repetition. Picks up on hints quickly, and is easy to guide when guidance is needed.
Easy to teach: Can quickly learn something new. Uses documentation, the interviewer, or other resources to expeditiously find and apply examples.
Evidence of past learning: Clear examples of learning from past experience. Lessons learned are insightful. Reflects on areas for improvement by asking “What could I do better?”

Bonus “good signs” (additive only)

These are things that we don’t expect or require from a candidate. If a candidate is missing these criteria, it should not count against them. But meeting these criteria is a bonus.

Expertise in a technical area: Distributed systems design, machine learning, web frontend, Android, iOS.
Strong product sense: Strong focus on and understanding of the user or client; a good sense of product design.
Passionate about Thumbtack: Interested in the mission, company, or our technical problems. Asks good questions.
Verifiable track record / has #GO: Got things done efficiently in past projects.

Originally published at https://engineering.thumbtack.com on February 22, 2018.

Thumbtack Engineering

From the Engineering team at Thumbtack

Written by

We're the builders behind Thumbtack - an online marketplace that matches customers with local professionals to accomplish their projects.
