How we interview

My friend Lara Hogan posted a template for setting up on-site interviews, and it occurred to me that other engineering managers might find it helpful to see the overall process my teams at Etsy use for conducting interviews. When I set this up I managed a group of four teams composed of roughly 25 engineers, and we’ve since scaled it up to an organization of roughly 50 engineers. I’ve edited it lightly for public consumption.

We designed our hiring process with the following goals in mind:

  • Hire great engineers — we want people who are really good at writing software, or who will be as they grow.
  • Hire great teammates — being a good programmer is not enough; we want to hire people we love working with, who uphold Etsy’s values, and who help all of us get better at our jobs.
  • Reduce the impact of unconscious bias on the hiring process and continue to build a diverse team. We want to choose people based on the attributes that are relevant to their performance at work, and not those attributes that our biases trick us into valuing.
  • Expose the engineers on the team to the interview process. We want people to have influence over who’s on their team. Becoming a better interviewer and getting comfortable in the interview setting is an important career skill for engineers.
  • Reduce the workload on teams that are hiring — teams that have open positions to fill are often really busy, and requiring them to fill all the interview slots from within the team can put them in a tough spot. Pulling interviewers from a much larger team smooths things out.

Here’s a rough outline of how we do hiring.

Step One: Review Résumés

Generally the recruiter and hiring manager sift through the résumés to figure out who we should phone screen. Aside from relevant skills, we tend to value people who include a cover letter, and whose résumé (or cover letter) gives us some indication of who they are as a person and why we might like to work with them. It would be cool to have some kind of “blind” process for evaluating candidates at this stage, but we don’t have that yet.

Step Two: First Phone Screen

The hiring manager generally does the first phone screen, and this interview is mostly scripted (I may post my phone screen script at some later time). We generally don’t do any kind of coding exercise in the phone screen, and that probably goes back to my distaste for those kinds of phone interviews (both as an interviewer and a candidate).

Step Three: Optional Second Phone Screen

If a candidate seems like they’d be worth bringing in for an on-site interview, the hiring manager will oftentimes ask me to do a second phone screen as a sanity check. I generally do an unscripted interview, and if it goes well, we bring them in. The managers on the team have good interviewing skills, so the second phone screen almost always leads to an on-site interview.

We’re currently considering replacing the second phone screen with a service like HackerRank in order to get a stronger read on technical skills prior to the in-person interview without subjecting people to the pain of coding over the phone.

A Note on Homework

Currently we do not ask candidates to do homework as part of the process. I actually like homework because it provides the candidate with the chance to show what they can do outside the high-stress environment of the interview room. I don’t feel like coding under pressure accurately models what it’s like to do the work of a software engineer, and so letting someone code on their own time feels more natural. However, it’s a competitive market, and asking people to put in hours of free labor to get a job seems unappealing. I also feel like the people least likely to get the job often spend the most time doing the homework, and that feels unfair to me. Going back to HackerRank, some kind of time-boxed exercise done before the interview feels like a reasonable compromise.

Step Four: On Site Interviews

This part is the most complicated, so I’ll cover it in more detail below.

Step Five: Decision

The goal of the in-person interviews is to collect as much data as we can about the candidate. The job of the hiring manager is to synthesize that data and make a hiring decision. So basically, the other interviewers have some influence over who gets hired, but the decision ultimately rests with the hiring manager.

On-Site Interviews

Our on-site interview process is highly structured, with an eye to reducing the impact of bias on our hiring decisions. We do five interviews (including lunch) and we include a shadower in every interview, both to ensure that our interviewing approach is sound and to onboard people into the interviewing process through observation.

The interviews are:

  • Lunch — generally this is with the hiring manager (and me). Maybe one day we’ll hire people that I’ve never even met but that still feels weird.
  • Hands on keyboard — some kind of programming exercise. Generally we do a relatively easy programming question or, for more senior candidates, an interview that tests their ability to manipulate log files. The main goal here is to get a sense for how people solve problems, interact with their text editor, and so forth.
  • Values — this interview is to see how aligned the candidate’s values are with ours, and just as importantly, to get a sense of what they’re like to work with. We have a standard list of questions that we pull from.
  • Subject Matter Expertise — this is a technical but not hands on interview. For this interview, we use Jocelyn Goldfein’s Recipe for an Interview and ask about a topic where they have a lot of experience. The topic should probably be decided as part of the pre-huddle.
  • Stakeholder — We usually have a stakeholder from outside the team interview the candidate, just to make sure they can communicate in a clear and compelling way with people who don’t have exactly the same job they do. We tend to prefer someone who’s not an engineer for this slot.
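
To give a flavor of the log-manipulation exercise mentioned above, here’s a toy task of the sort such an interview might use. This is entirely my own invented illustration, not an actual Etsy interview question: count requests per HTTP status code in some web-server log lines.

```python
# Hypothetical example of a log-manipulation exercise (not a real
# interview question): tally requests by HTTP status code.

from collections import Counter

LOG = """\
127.0.0.1 GET /listings 200
127.0.0.1 GET /listings/42 404
10.0.0.5 POST /cart 200
10.0.0.5 GET /cart 500
"""

def status_counts(log_text):
    """Return a Counter mapping status code -> number of requests."""
    counts = Counter()
    for line in log_text.splitlines():
        parts = line.split()
        if parts:                   # skip blank lines
            counts[parts[-1]] += 1  # status code is the last field
    return counts

print(status_counts(LOG))  # → Counter({'200': 2, '404': 1, '500': 1})
```

The point of an exercise like this isn’t the answer itself; it’s watching how the candidate explores the data, handles edge cases, and uses their tools.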

Three of the interviews (values, subject matter expertise, and stakeholder) are behavioral-style interviews. Our recruiting department provides a list of behavioral questions on a number of topics that we use.

Choosing Panelists

We have a spreadsheet with an entry for each candidate we interview, recording which interviewers performed which interviews and how they scored the candidate. The goal is to make sure everyone on the team is getting to participate in the process, and that people are learning how to perform a variety of interviews. We also use it to get a sense of how people on the team tend to score candidates (yes, some people just put “Inclined” for everyone).

The general approach is to let people shadow a particular interview a couple of times before performing it themselves while shadowed by someone more experienced.

We know that in some ways, shadowing can make the experience slightly more awkward for candidates, but the benefits outweigh that. Shadowing is the only real way to get useful feedback on how to become a better interviewer, and it gives us a way to onboard people so they can observe the process before interviewing candidates themselves.

For the first couple of on-site interviews for a new position, we tend to have in-person meetings to talk about the candidate and let the interviewers know which interview they’ll be doing. Then we tend to migrate to a preparation email. From now on, though, I’ll probably just use a version of the template Lara created.
Entering somewhat detailed feedback into our recruiting tool is really important. Ideally feedback will be entered with enough time for the hiring manager to review and synthesize it prior to the wrap-up meeting. Feedback should state which interview you gave and include detailed information about what was asked and how the candidate did. In terms of scoring, the rubric is roughly as follows:

  • Strong Hire — You would be a “yes” on hiring them based on the information from your interview alone
  • Inclined — You are a yes if other people’s interviews went well
  • Not Inclined — You are a no, but could be persuaded to switch to “yes” based on other people’s interviews
  • Strong No Hire — You are absolutely opposed to hiring the candidate
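
As a rough illustration of how a panel’s scores on this rubric might be tallied before the debrief, here’s a hypothetical sketch. The score names come from the rubric above; everything else (the function, the sample panel) is invented, and in practice the hiring manager synthesizes feedback by reading it, not by running a script.

```python
# Hypothetical sketch (not actual Etsy tooling): summarize a panel's
# rubric scores for the hiring manager. The decision still rests with
# the hiring manager; this only surfaces the shape of the feedback.

from collections import Counter

def summarize(feedback):
    """feedback: dict of interview name -> rubric score string."""
    tally = Counter(feedback.values())
    # A "Strong No Hire" signals absolute opposition, not mild doubt,
    # so it's worth surfacing loudly on its own.
    vetoed = tally["Strong No Hire"] > 0
    leaning_yes = tally["Strong Hire"] + tally["Inclined"]
    leaning_no = tally["Not Inclined"] + tally["Strong No Hire"]
    return {
        "tally": dict(tally),
        "veto": vetoed,
        "leaning": "offer" if leaning_yes > leaning_no and not vetoed else "discuss",
    }

panel = {
    "Hands on keyboard": "Inclined",
    "Values": "Strong Hire",
    "Subject Matter Expertise": "Inclined",
    "Stakeholder": "Not Inclined",
}
print(summarize(panel)["leaning"])  # → offer
```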
Usually we have a debrief meeting about the candidate the day after the interview. One important goal is to reduce the time between the interview and the offer (if the candidate is going to get an offer). Making people wait to hear back is really painful.

Traditionally, the debrief meeting consisted of people taking turns repeating their feedback on the candidate out loud, followed by some discussion in an effort to gain consensus on whether to make an offer or not. If it’s obvious there won’t be an offer, the debrief becomes a post-mortem on the interview process.

I prefer to have the hiring manager synthesize the feedback into broad areas of agreement and points to discuss. At the debrief, they present the areas of agreement, drive a discussion of the open points, and then ask if people have anything else to raise. At that point, it’s usually obvious whether there will be an offer. If it isn’t, the hiring manager makes the call in the room or privately in consultation with their manager.

Creating a Positive Candidate Experience

The most important thing to remember about interviewing is that it is a window into our company for candidates. Assessing the potential of the candidates is important, but so too is making sure that the candidates have a great experience interviewing at Etsy. At the end of the interview, we want people to feel like their time was well spent, and we want them to want the job. Candidates also do a better job at the interview when they are relaxed and happy, so we want to do what we can to help them relax and be happy.

Explaining why Etsy is a great place to work is also part of every interviewer’s job. Interviewers should obviously spend plenty of time asking questions, but they should also take some time to talk about their job, what’s good about it, and why the candidate might enjoy working here.

Open Questions

A few open questions remain unresolved:

  • How might we make the “hands on keyboard” interview feel more like a sample of actual work the candidates will be doing?
  • Should we always require homework or a coding sample?
  • How might we do blind reviews of résumés?

Reference Material

Here are some references and reading material that underlie our approach: