Redesigning the homepage for one of Europe’s largest digital healthcare companies

A Design Sprint case study

Carl Worricker
ZAVA
7 min read · Oct 31, 2019


I organised and facilitated a design sprint to collaboratively redesign the homepage experience of the ZAVA website. This case study highlights the process and outputs of the Design Sprint, as well as giving some insights into the results of various methods of testing and analysis.

Me (in a captain’s hat) writing up some key themes during the Design Sprint

But first, some context…

ZAVA is one of the largest digital healthcare players in Europe. We are attempting to break down barriers to health and enable people to do more of what matters to them by building healthcare that’s accessible, dependable, and a fraction of today’s cost.

We were previously known as DrEd, and operate in several countries across Europe, most notably in the UK and Germany.

Also, the captain’s hat is a team tradition; we wear the captain’s hat whenever one of us facilitates a meeting. I don’t casually wear captain’s hats…

Welcome aboard 👨‍✈️

Why we Sprinted

A homepage has many roles. Those roles will vary depending on context.

For a homepage within healthcare, we know that comprehension of, and trust in, the service are particularly significant.

As healthcare undergoes a digital revolution, we must be able to coherently explain our service — who we are, what we do and why our offering is safe. The way our patients interact with our doctors differs significantly from the offline experience they’re familiar with, and if we can’t convince them to trust our service then there’s little chance of them becoming a patient.

We decided to tackle this challenge with a redesign of the homepage.

The importance of the project along with a desire to move quickly led us to the decision to run the project as a Design Sprint.

The Structure of the Sprint

Rather than adopt a traditional Design Sprint structure, I decided to use the updated Design Sprint 2.0 approach.

What is the Design Sprint 2.0?
Design Sprint 2.0 is simply the most up-to-date, semi-official version of the Sprint, as of May 2018. One of the biggest differences between the original Design Sprint and the Design Sprint 2.0 is that 2.0 is optimised to work not just in startups, but also in large organisations that don’t necessarily have time to commit an entire week to the full process.
- Reference

There were two reasons behind this decision:

  1. We already had a brief and area of focus so I didn’t want to spend too much time on the journey maps. Instead, I wanted to put greater emphasis on expert interviews.
  2. Not everyone is needed for the entire Sprint, so I wanted to limit the effect on everyone’s day-to-day work.

We used a Time Timer throughout the first day to create a sense of urgency within the room and improve confidence in the process.

The Problem and The Solutions

The first day was split in half: the morning was for understanding the problem and gaining as much context as possible; the afternoon was for ideation and solutions.

The Morning

Firstly, I spent a small amount of time explaining what a Design Sprint was, why we were doing it, and what we were hoping to get out of it. This naturally led us into discussing and agreeing on the project goal:

To create an outstanding homepage experience

As I previously mentioned, we wanted to put much more emphasis on expert interviews during this Sprint. We spent two hours interviewing four experts, one each from UX, Research, Marketing, and SEO. Whilst the interviews were taking place, I asked everyone in the room to write How Might We statements on Post-it notes based on the experts’ responses. This allowed us to affinity map everyone’s Post-its in order to create key themes for the Sprint.

Whiteboard with a Post-it note affinity map, and key themes

The key themes for the Sprint were kept as How Might We’s:

How might we…

  • build trust
  • show how ZAVA works / what ZAVA does
  • show who ZAVA is through brand personality and achievements & profile
  • create a bangin’ first impression
  • demonstrate the value of being doctor-led
  • adapt to multiple user journeys

The Afternoon

Straight after lunch we went into Lightning Demos, where we all shared our opinions of outstanding homepage experiences, on both competitor and non-competitor websites. Whilst the Lightning Demos were taking place, I sketched out interesting features from the homepages onto a whiteboard for reference later.

We then did 4-part sketching, which comprises boot-up note-taking, ideation, Crazy 8s, and then solution sketching.

The Sprint team busy creating their solution sketches

Once solution sketching was finished, I placed all of the ideas up on the wall. Solutions were kept anonymous whilst I presented the ideas to the group. Everyone was then given unlimited dot stickers and was encouraged to vote as much as they wanted on individual features of each idea.

Me presenting everyone’s solution sketches before dot voting

At the end of the first day, we had a selection of dot-vote heat maps which helped us identify the features we felt were doing the best job at solving the previously defined key themes.

Co-designing

The next day, I worked with another designer to co-design hi-fidelity UI based on the highest-voted features from the previous day’s solution sketches.

We designed in Sketch, and used Abstract so we could work individually, then periodically merge ideas back into a master file. We also used Principle for prototyping any examples of motion that were required.

The hi-fidelity UI created from the co-design session

Testing

We ran several tests in order to give us the best chance of having strong and contextual insights. We also aimed to gather a mix of qualitative and quantitative data.

A/B Test

We tested the new design in Optimizely against the original base design, with success measured as a 6% expected uplift in users starting the assessment journey. It is common practice in our squad to perform our A/B tests on mobile only, as the large majority of our traffic is on mobile. This means a developer only needs to create a mobile version in Optimizely, which saves time and effort.
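For anyone curious what “statistically significant” means in an A/B test like this, the underlying check is essentially a two-proportion z-test on conversion rates. The sketch below uses illustrative numbers (a 5% base conversion rate, 10,000 users per arm, and the hoped-for ~6% relative uplift) — these are not ZAVA’s real figures, and Optimizely’s stats engine is more sophisticated than this, but the core idea is the same.

```python
# Minimal two-proportion z-test, stdlib only. Illustrative numbers,
# not real ZAVA traffic data.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) comparing conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of no difference
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 5.0% base rate vs 5.3% variation rate (a 6% relative uplift)
z, p = two_proportion_z(500, 10_000, 530, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With a sample this size, even the full hoped-for uplift isn’t significant (p well above 0.05) — which is why A/B tests on a homepage typically need to run long enough to accumulate much larger samples before calling a result either way.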

Heat maps

We ran heat map tests using HotJar on both the new design and the base design to gather insight into where people are tapping on the homepage, how far they are scrolling, and how that behaviour differs between the variations in design.

Service comprehension test

We also ran two service comprehension tests (one in the UK, and one in Germany) in the form of a survey on Helio, where we tested three variations of copy, each attempting to coherently explain what ZAVA is and what ZAVA does.

During this test, we also asked the participants to comment on the look and feel of the new design in order to gain some qualitative feedback which could be used for a design iteration in the future.

The Results

A/B Test

Overall, there was no statistically significant change in primary or secondary metrics, but we did have several interesting learnings from the A/B test.

Line graph with labels removed shows the result of the A/B test

The base design (blue) had higher peaks and lower troughs than the variation (green), so the new design was converting more consistently throughout the week.

The new design surfaced a few more of our breadth products, but slightly reduced the emphasis on one of our core products, which stood out as one of the biggest areas for improvement in the next iteration.

Heat maps

Through the HotJar heat maps, we saw a significant increase in engagement, measured by the number of taps, on the new design. We also saw some changes in user behaviour between the base and variation designs.

One of the tap heat maps which allowed us to gain insights into user behaviour

We also saw that users were seeing much more of our new homepage design compared to the base design, our assumption being that users could be learning more about the service. Below you can see the scrolling heat maps:

Scrolling heat map for the base (left) and variation (right)

Service comprehension tests

We found some strong themes in the comprehension tests, with a winning variation of copy emerging in each locale (UK and Germany).

We also had some strong themes in the feedback on the look and feel of the new design which will be used to help steer the next co-design session.

What’s next?

The outcomes of these tests have given us clear direction for the next iteration of the design. We will be specifically focusing on regaining sales for our affected core product whilst maintaining the positive exposure gained for other products, and looking at how to implement the improved look and feel updates based on the visual design feedback.
