Vodafone UK’s Web Testing Strategy

How Vodafone UK went about unifying its front-end testing strategy across a team of more than 100 developers.

Sunny Patel
Vodafone UK Engineering
7 min read · May 28, 2021

--

Testing… the word that sends shivers down a developer’s spine. If your response to “Have you written test cases for this?” is nothing but a swear word, you know just how painful it can be! Testing is a crucial step in development, as it gives developers full confidence that the code they have written fulfils its purpose without affecting any other work in the project.

So… what’s the problem?

When you have a small team and only a handful of applications to manage, finding and sticking to a testing strategy is manageable. The challenges arise when your application is split across many repositories and micro-sites, all handled by many developers, each bringing their own knowledge, thoughts, ideas and opinions to an already broad topic. This is the challenge Vodafone UK has faced. Whilst the number of developers we have is growing rapidly, it is a mighty challenge to keep a grip on one testing strategy across the board.

Every developer has their own ideas on the best approach to testing their code; these range from the very basics of naming conventions to more complex decisions, such as which framework to use (for example, Cypress, React Testing Library or Enzyme) or how to mock functions. The reasoning behind these decisions is seldom documented, posing a challenge to others looking to set up new repositories, or joining a team where the technology stack has changed, as they are unsure of what to use and why.

So, the solution is to…?

We needed to unify our testing strategy: a single source of truth holding all the information - the decisions made, which technologies and libraries to use where and when, which teams are using what, and the best practices for these tools.

We formed a new group called Radiators (massive shout out to our Agile Coach for the name!), consisting of members from different backgrounds and scrum teams: developers, UAT Automation Engineers and Chapter Leads. Each member normally belongs to a different team, which meant we had a variety of knowledge within Radiators from the start. This was also a good excuse to work with new colleagues and form new relationships - which can be challenging whilst remote working.

What’s the goal?

The goal was to create a well-documented set of guidelines on the types of testing we carry out in Vodafone, outlining when to use one approach over another and listing the advantages, disadvantages and best practices of each.

The three main aspects we were looking to cover were:
- Unit Testing
- Integration Testing
- End to End testing

This was the perfect chance for us to iron out bad habits that had become ingrained in our current testing strategy. For example, when carrying out visual regression testing, must every test case be handled with a snapshot, or can we simply use an assertion test (in instances where the user interface isn’t impacted)? It also gave us an opportunity to research other testing frameworks and tooling that we hadn’t initially considered, to ensure that we are following the very best practices and staying on top of our game.

How did we tackle this?

Throughout the fortnight allocated to this work, we spent time auditing our existing repositories, digging deeper to investigate their ways of testing. We reached out to the developers involved in the decision making to get a better understanding of their choices (who now probably hate us, given the number of times we have distracted them #SorryNotSorry).

A visual representation of us approaching the lead developers

From the feedback we received, we put together a set of proposals outlining the best approach for each testing scenario and the tools aligned to it. For example, a common point of contention across many repositories was whether unit tests should cover click simulations; developers felt strongly on both sides of the argument. After some research, we found that click simulations within Jest and Enzyme are due to be deprecated, which encouraged us to move away from this method.

So, all we did was research and document our strategies?

Not quite.

The two weeks also gave us an opportunity to investigate some new tools: Cypress, an end-to-end testing framework, and Applitools, a visual regression testing tool. Applitools can be integrated with Cypress, and together they provide the cross-browser testing support our application requires.

We were looking at Applitools to improve on our current visual regression testing framework - a bespoke application that is difficult to understand and troubleshoot due to its lack of documentation.

We only got started with Applitools two days before the end of our two weeks, so although we were time-boxed, we were able to achieve the following:

  • Ran a workshop with Applitools, learning the basics of testing with Cypress and Applitools.
  • Set up Cypress and Applitools in an existing repository, taking some simple snapshots and understanding how the Applitools console works.
  • Got Applitools and Cypress running in the pipeline, which adds an extra layer of security before merging code into master and reduces time wasted running tests locally.
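For reference, wiring the two into a pipeline can come down to a couple of lines in the CI job. This is a sketch only - the secret variable name below is hypothetical and will differ between CI systems:

```shell
# Run Cypress headlessly in CI, with the Applitools eyes-cypress plugin
# already configured in the repository. The Applitools SDK reads its key
# from APPLITOOLS_API_KEY; CI_APPLITOOLS_KEY is a made-up secret name.
export APPLITOOLS_API_KEY="$CI_APPLITOOLS_KEY"
npx cypress run --browser chrome --headless
```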

We also took part in workshops on other testing principles, such as BDD (Behaviour-Driven Development), learning how to write test cases using a BDD approach with Cypress. Alongside these, we attended Cypress Masterclass workshops covering the best practices and techniques for using it in our application.

What were our proposals in the end?

Below you will find a brief summary of some of the key aspects we put together.

Unit Testing:

Technology stack: Enzyme, Jest, React-Testing-Library

  • Created a template for test files with a basic structure of how the file should be formatted.
  • Created a standard of how to mock functions with Jest.
  • Avoid using NPS (npm-package-scripts) commands for script execution - this adds unnecessary complication when trying to understand what a command actually does.
  • Set the unit test coverage threshold to 90% - no code will be merged into master if it fails to meet this requirement.
  • Enzyme vs React-Testing-Library - both have their advantages; for example, Enzyme’s limitations around testing React Hooks mean that React-Testing-Library is preferred for function components.
  • Shallow vs mount vs render - use shallow to test the component itself and not its children; use mount when you need to test the full DOM element (with the children); use render for a basic check that an element exists.
  • Avoid using click simulations in Enzyme as it is due to be deprecated. Instead, invoke the function in that prop.
  • Whether to write our tests in TypeScript or JavaScript - each has its merits. JavaScript keeps things less complicated by excluding type checking, as unit tests should test behaviour more than types; however, writing the tests in TypeScript encourages greater care over test quality, and the tests will break alongside the code if the types are incorrect.
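To make the first few points above concrete, here is a minimal sketch of the kind of test-file template we standardised. `Button` is a hypothetical component, and the snippet assumes Jest and Enzyme are already configured in the repository; note that the handler is invoked via its prop rather than through the soon-to-be-deprecated `simulate`:

```javascript
// Button.test.js - a sketch of the agreed template, not a drop-in file.
// 'Button' is a hypothetical component under test.
import React from 'react';
import { shallow } from 'enzyme';
import Button from './Button';

describe('Button', () => {
  it('calls the onClick handler when clicked', () => {
    const onClick = jest.fn(); // standard Jest mock function

    // shallow: test the component itself, not its children
    const wrapper = shallow(<Button onClick={onClick} />);

    // Avoid wrapper.find('button').simulate('click') - simulate is due
    // to be deprecated. Invoke the function held in the prop instead:
    wrapper.find('button').prop('onClick')();

    expect(onClick).toHaveBeenCalledTimes(1);
  });
});
```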

Interaction Testing:

Technology Stack: Cypress

  • Created a step by step guide on how to set up Cypress.
  • Set up a basic folder structure for Cypress and its test files.
  • Created a template for the test files, with pre-defined common naming conventions.
  • Standardised the viewport sizes to test responsiveness across mobile, tablet and desktop screens.
  • Documented how to apply BDD to Cypress.
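As an illustration of the last two points, a BDD-flavoured Cypress test iterating over standardised viewports might look like the sketch below. The route, selectors and viewport presets are hypothetical examples, not our actual values:

```javascript
// checkout.spec.js - illustrative only; routes and selectors are made up.
// Standardised viewports: mobile and tablet via Cypress presets, desktop
// as an explicit width/height pair.
const viewports = ['iphone-6', 'ipad-2', [1280, 720]];

describe('Given a customer with an item in their basket', () => {
  viewports.forEach((viewport) => {
    it(`When they check out, Then the payment form appears (${viewport})`, () => {
      // cy.viewport accepts either a preset name or (width, height)
      Array.isArray(viewport) ? cy.viewport(...viewport) : cy.viewport(viewport);
      cy.visit('/basket');
      cy.get('[data-test="checkout"]').click();
      cy.get('[data-test="payment-form"]').should('be.visible');
    });
  });
});
```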

Visual Regression Testing:

Technology Stack: Applitools

  • Created a step by step guide on how to set up Applitools.
  • Documented the key aspects of the Applitools console, its main features and defined its purpose.
  • Compiled a list of advantages and disadvantages of visual regression testing, highlighting the huge benefit it brings to the quality of our work: the extra assurance that our code hasn’t unintentionally affected the user interface.
  • Defined when to take a snapshot and when to rely on assertion testing: snapshots should only be taken when there are visible changes to the user interface (such as pop-ups or new content being rendered); most other cases should be covered by assertion tests.
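In practice, the two styles sit side by side in a Cypress spec. The sketch below uses the commands added by the official `@applitools/eyes-cypress` plugin; the app name, test names and selectors are hypothetical:

```javascript
// visual.spec.js - sketch assuming @applitools/eyes-cypress is installed.
describe('Homepage', () => {
  it('matches the visual baseline when the cookie banner appears', () => {
    cy.eyesOpen({ appName: 'Vodafone UK', testName: 'Homepage' });
    cy.visit('/');

    // The UI visibly changes here, so a snapshot is justified:
    cy.eyesCheckWindow('Cookie banner shown');
    cy.eyesClose();
  });

  it('covers non-visual behaviour with a plain assertion', () => {
    cy.visit('/');
    // No visible UI change to verify - a simple assertion is enough:
    cy.get('[data-test="page-title"]').should('contain', 'Welcome');
  });
});
```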

So, what next?

Our set of guidelines is now complete! We presented our findings to the wider team and received positive feedback, meaning we were all aligned on the decisions made and the reasoning behind each of the tools and standards we should follow.

This will now serve as the standard for any new repositories created, and we will gradually bring our existing repositories up to it. Every repository will need a testing strategy, and our guidelines will help each team reach a decision based on their requirements.

So… is that it? Job done?

Of course not, this is only the start!

We will regularly update our guidelines as technology evolves, keeping up to date with new features in our current testing stack and following its best practices.

We will also follow up on some of the suggestions made above; for example, we may look to write all of our tests in TypeScript, or move away entirely from simulating clicks (this one is a must!), but these will require further investigation before rolling them out across the whole of Vodafone UK.

Although it may seem like a small step on paper, this is a massive win for us in setting the standard for what comes next - our current and future colleagues will thank us later…
