Implementing peer reviews in our CS team

Luis Hernandez
Geckoboard: Under The Hood
4 min read · Sep 12, 2019

We recently decided to introduce Customer Success peer reviews. This decision came as the answer to two different questions:

  1. How could we be more proactive with our customers to help them succeed with Geckoboard?
  2. How could we easily highlight to each other missed opportunities in our daily interactions with customers?

We had explored a number of different ways to crack these questions, including “yes, and…” interactions as part of our OKR cycle. But despite knowing the rationale behind it, and despite everyone agreeing we should pursue it, we kept falling short. We were missing the right process and tools to really drive change forward.

Luis using Klaus

It got me thinking, why not extend our usage of Klaus to remind each other about the different things we know we could do with customer interactions to increase product engagement?

We’d already been using Klaus to help new joiners. The tool allows us to review interactions between new joiners and customers. Analysing these first interactions helps us ensure the new joiner has understood what they’ve been asked, that their answers are accurate, and that their tone of voice follows our standards.

Working on this idea further, I realised there would be additional advantages to increasing knowledge-sharing through peer feedback. Team members in the US, for example, aren’t as familiar with GDPR-related questions as team members in Europe. Peer feedback offered a way to unlock that knowledge.

Similarly, many team members have discovered or created workarounds that aren’t written down anywhere. Reviewing each other’s work would allow the team to compare notes and document those hidden gems of information.

How we implemented peer reviews

I didn’t want to implement unilateral feedback and be the sole reviewer, as I wanted to avoid making the team feel I was breathing down their necks. I also felt it was important to choose the exact “flavour” of what to review carefully, to make sure the process would be effective.

We decided to start with two peer reviews per day as we normally have the bandwidth to spend 30 or so minutes a day on this kind of project. We set a few basic rules and created a scorecard with the six things we wanted to improve:

Our peer review criteria

To be able to rate an interaction, every team member would need to be familiar with all the elements used in the scorecard. This also means that each team member having a conversation with a customer would need to keep these key elements in mind.

As you’d expect, we’ve had some initial teething problems. For example, our main task is to support customers, so peer reviews only happen when we can truly book some time away from the queues. If a team member is away on a particular day and others absorb the extra load, they might not have time to peer review that day.

It has also transpired that the team tends to be very positive: pointing out areas for improvement doesn’t come as naturally to them. This isn’t surprising, as our lexicon encourages team members to use positive language, and they take a similar approach when reviewing colleagues to the one they use when talking to customers. However, only celebrating the positives while reviewing a colleague’s work can lead to what Kim Scott refers to as “ruinous empathy” in Radical Candor.

Results so far

It’s been six weeks since we implemented peer reviews for support and I have to say the results have surpassed our greatest expectations.

Empowering the whole team to peer-review makes the process entirely scalable, and the reviews now cover a more representative sample of our customer interactions.

We are now sharing more information internally and providing ten times as much proactive advice between team members as we did before.

And all the data points show us delivering excellent support with a great CSAT score and impressive response times.

What’s next?

We moved from two peer reviews a day to ten peer reviews a week as we continue to calibrate the process. The overall volume is unchanged, but this removes the pressure to review conversations every single day and lets people choose a day when they have time.

On top of that, I’ve been reviewing the reviews to help calibrate evaluations and ensure we make the most of giving and receiving feedback. I want everyone to feel free to point out improvements without guilt or hesitation.

Peer reviews on Klaus have proven to be a true platform for development in our support team. Reviewers learn from the interactions they review (e.g. new workarounds, requests they’ve never been exposed to). We’ve improved compliance across our entire process, and the feedback is constantly available to everyone on the team.
