10 Lessons I Learned as a Software Engineering Interviewer

Interviewing is a notoriously painful and imperfect process. Over the past year, alongside my engineering work, I helped make a series of improvements to the hiring process at Samsara. Along the way, I discovered a ton about streamlining and standardizing the candidate experience.

From the perspective of an engineering interviewer, here is what I learned.

Before interviewing any candidates, make sure that specific headcount goals are established. A concrete hiring plan includes the specific positions that need to be filled, including the number of developers for each type of position: design, frontend product, backend product, infrastructure, data science, etc. After identifying the positions that need to be filled, publish job postings that include a description of the company, the role, the technology stack, position requirements, and sample projects that people in each position have completed.

Hiring goals shape the recruiting pipeline. For example, if you are hiring for one infrastructure position and five frontend positions, then you should be interviewing around five times more frontend developers than infrastructure developers. Concrete hiring goals will also help you predict a target number of new grad or intern hires, who typically need to commit to a job 9–12 months in advance.

Planning several months ahead will indicate how large the recruiting team should be. Without the right support, you will have difficulty sourcing enough qualified candidates and closing candidates who get to the offer stage. For instance, if you are aiming for five hires in the next few months, the recruiting team needs to be staffed to source outbound leads, filter through hundreds of resumes, organize the logistics for around 20–30 on-sites, and deliver 10–12 offers. Not having the necessary recruiting team will result in candidates falling through the cracks or being frustrated by a lack of communication during their interview process.
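Those numbers imply a conversion funnel. A minimal sketch of that arithmetic, with purely illustrative pass rates (assumptions, not Samsara's actual numbers), shows how a hiring goal translates into recruiting workload at each stage:

```python
# Estimate recruiting workload from a hiring goal by walking a funnel
# backwards. All rates are illustrative assumptions for the sketch.

def funnel_sizes(hires_needed,
                 offer_accept_rate=0.5,    # offers accepted / offers made
                 onsite_pass_rate=0.4,     # offers made / on-sites held
                 resume_pass_rate=0.1):    # on-sites held / resumes reviewed
    offers = hires_needed / offer_accept_rate
    onsites = offers / onsite_pass_rate
    resumes = onsites / resume_pass_rate
    return {
        "resumes reviewed": round(resumes),
        "on-sites": round(onsites),
        "offers": round(offers),
        "hires": hires_needed,
    }

# Five hires works out to roughly the workload described above:
# hundreds of resumes, ~25 on-sites, ~10 offers.
print(funnel_sizes(5))
```

With these assumed rates, a five-hire goal means reviewing a couple hundred resumes and running about 25 on-sites, which is why the recruiting team needs to be staffed well before the hiring deadline.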

A front-end candidate should not be tested using the same set of questions given to an infrastructure candidate. The job descriptions are different, so the sets of required skills are different.

For example, a front-end candidate should be handy with JavaScript, familiar with promises, and able to manipulate the DOM with ease. A firmware candidate should be familiar with low-level systems and the constraints of low-memory devices.

There is some overlap; all candidates should be able to translate logic into code. Evaluating skills relevant to each specific job position, however, requires some job-specific questions.

Algorithmic questions get a bad rap, because they are often irrelevant to anything an engineer would ever be doing. However, there is a class of algorithmic questions that is relevant to the job: those already implemented in your codebase. If a candidate cannot come up with an algorithm that your team needed to implement, they do not have the algorithmic ability to excel at the company. Pick the most difficult algorithm in your company’s code, and do not ask an algorithms question more challenging than that.

Moreover, diversify the interviews by asking questions that cover a wider array of skills. Some interview ideas include:

  • Bug bash: find a patched bug in a small open source project, remove the patch, and ask the candidate to re-implement the fix
  • Feature addition: ask the candidate to add a small feature to an open-source codebase
  • Systems design: ask the candidate to design a system that addresses a real-world problem

Giving a candidate three of the same type of question will give you three of the same data point. Diversifying the question panel gives the candidate a chance to shine in multiple areas, and gives you a better picture of their skills.

While new grad and intern candidates will have less experience with debugging and working within large codebases, the difference can be made up with increased support from the interviewer. The interviewer can gauge a new grad’s ability to receive feedback from more experienced peers by providing stronger guidance on the same questions and observing whether the candidate adapts to instruction.

Many interviewers ask single-part questions, resulting in 30–45 minutes of increasingly nervous progress towards a single solution. If the candidate finishes the question, they pass. If not, they fail.

At best, this is a binary evaluation metric — 1 for success, 0 for failure. Other important aspects of software engineering, such as testing and code cleanliness, often take a back seat to the single monolithic pass/fail mark.

If the candidate fails, they usually know it. Interviewers are all too familiar with the awkward interruption that goes, “I’m sorry, but we’re out of time. What would you have done given an extra 10 minutes?” The candidate knows they have failed, and this failure will drift into every interview that comes afterwards. I have seen candidates fail the first interview and become so distraught that they seem distracted for the rest of the day.

With multi-part questions, the candidate does not know how many parts there are in the question. They could solidly finish two parts and be unaware that there were a total of four. If a candidate struggles on one part, they have plenty of room to shine later on in the question. Even if progress is slow, they can demonstrate thorough test coverage or impeccable code cleanliness.

Furthermore, the bar for passing a multi-part question can be easily adapted to new grad and intern candidates. If experienced candidates are required to pass all three parts of the question in 40 minutes, then an appropriate new grad bar might be finishing all parts in 50 minutes. The analogous intern bar could be finishing two out of the three parts in 40 minutes.

No one codes on a whiteboard. Implementing a solution on your own computer is far more predictive of ability than whiteboarding. By observing how a candidate performs in a familiar environment, interviewers can evaluate how comfortable the candidate is with common day-to-day tasks, such as looking up documentation and running code.

Every interview question should be accompanied by a detailed description of what constitutes a “pass.” The passing bar can vary by experience level, as long as the specifics are documented. The potential impact of criteria such as test coverage and code cleanliness should also be included in the description of the bar.

Each interviewer should have the same answers when asked about the following edge cases:

  • A candidate finishes the question comfortably, but the candidate’s code is a mess, with copy-pasted code everywhere. Does the candidate pass?
  • A candidate writes impeccable code and thorough tests, but is only halfway through the last part of the question at the time limit. Does the candidate pass? Does the candidate get extra time?

Bias leaks into interviews when edge cases are left up to interviewers’ best judgment. I have seen some candidates given an extra 20 minutes to finish up and awarded a “weak yes.” I have seen other candidates cut off at the time limit and failed with ~5 minutes of work left. Either scenario is perfectly acceptable; what is not acceptable is a lack of standardization.

Don’t tire out a candidate by letting each interview run over. If an interviewer is more than five minutes over time, the next interviewer should cut them off. Not doing so is disrespectful to the candidate’s time.

Moreover, when telling the candidate their schedule for the day, give a time range, rather than a concrete number of technical interviews. If the candidate performs poorly on the first two interviews, someone from the recruiting team should early exit the candidate, out of respect for both their time and the engineering team’s. The candidate should not notice that their interview schedule has been shortened.

Before the on-site, a candidate’s interview team should coordinate the questions each person will ask the candidate. Throughout the day, the team should stay in communication about any changes to the schedule, such as lunch delays or early exits.

Within 24 hours of the on-site, the interview team should sync up about feedback and make a decision on extending an offer. Syncing up quickly helps keep the memory of the interviews fresh and allows the recruiting team to deliver an answer to the candidate quickly, within 24–48 hours. Waiting a week to get back to a candidate can make a bad impression.

If the team decides to make an offer, select one developer from the candidate’s interview panel to personally reach out and offer to chat about the opportunity. The person should be someone with whom the candidate would work closely. Recruiters know to be “high touch” when closing candidates; the interviewer can help by making themselves a resource through which the candidate can learn more about the engineering team.

Even if a candidate aces every question, categorically reject people who repeatedly interrupt and are rude to interviewers. One of the most important parts of every engineer’s job is interfacing well with others. A candidate should be able to adapt to feedback and explain their mental model of the solution in a way that others can comprehend. At the very least, candidates should recognize when to ask questions of their interviewers and understand an appropriate level of design pushback. If a candidate refuses to listen to a subset of their peers, they will not function well as a member of your engineering team.

Not every qualified candidate is right for your company. For example, if a candidate is looking for a machine learning position, and your company does not do machine learning, the candidate is not a fit regardless of engineering skill. If a candidate wants to work in NYC due to family commitments, and your company is in San Francisco, they are not a fit. Pushing a candidate who is not a fit to finish out the interview process wastes both their and your engineering team’s time.

Even if you reject a candidate, they can still be an evangelist for your company. I have been passed referrals both from candidates we have rejected and from candidates who have rejected us. If a company is not the best fit for someone but still comes across as a great company, they will recommend it to their friends.

Regardless of the outcome, every candidate should walk away from their interviews thinking that they were treated with complete respect. Many companies hand candidates a swag bag as they leave an on-site, or send a small gift in the mail afterwards, independent of an offer or rejection.

Overall, interviews are a window through which your company’s engineering reputation is formed. Improving the process for candidates will help make a positive impression on outside observers and, as a result, make your hiring process more effective.

A thank you to Ankur Goyal, Richard Ni, John Holliman, Elisha Paul, Tsvi Tannin, and Christopher Sauer for edits.

Senior Engineering Manager @ Samsara