How to Build E2E Test Cases

Dan Widing
ProdPerfect
Feb 6, 2020 · 5 min read

Developing end-to-end (E2E) test cases poses a substantial challenge compared to writing unit and API test cases. Blocks of code and APIs can be tested against a well-defined, limited, and predetermined set of business rules. Test-driven development (TDD) techniques can empower developers to write relevant tests alongside their code.

Developing E2E tests requires an entirely different approach. This level of testing is meant to replicate the behavior of users interacting with many blocks of code and multiple APIs simultaneously. Below we recommend a process that will help you build accurate, effective test cases for your E2E testing regime. Note that we will not cover test scripting here, only test case development.

There are several considerations, each of which will be explored in turn: how to scope E2E testing, which user flows to follow, and how to design test cases.

How to Scope E2E Testing

The goal of E2E testing is to make sure that users can use your application without running into trouble. Usually, this is done by running automated E2E regression tests against said application. One approach to choosing your scope could be to test every way users could use an application. This would certainly represent true 100% coverage. Unfortunately, it would also yield a testing codebase even larger than the product codebase, and a test runtime that is likely as long as it takes to write the build being tested.

Senior QA engineers are also often used to determine scope. Combining experience, knowledge of the codebase, and knowledge of the web app’s business metrics, a QA engineer can propose tests that stop your users from encountering bugs when performing high-value actions. Unfortunately, this approach has a weakness: a biased understanding of the web app, its cost, and its reliance on one individual inevitably lead to bugs making their way into production.

The team should therefore instead test only how users actually use the application. Doing so yields the optimal balance: thorough test coverage without excessive resources or runtime, and without relying on an expert to predict how customers use the website. This approach is driven by user data rather than an expansive exploration of every feature option in the application. To mine that user data, you’ll need some form of product analytics that shows how your users currently use your application.

E2E testing should not replace or substantially repeat the efforts of unit and API testing. Unit and API testing should test business logic. Generally, a unit test ensures that a block of code always results in the correct output variable(s) for given input variable(s). An API test ensures that for a given call, the correct response occurs.
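For contrast, here is a minimal sketch of that lower-level coverage, assuming a hypothetical calculateCartTotal module and a hypothetical /api/cart endpoint (written with Vitest, with the API call made over plain fetch):

import { test, expect } from 'vitest';
import { calculateCartTotal } from './cart'; // hypothetical module under test

// Unit test: a block of code maps known inputs to the expected output.
test('calculateCartTotal applies a 10% discount to orders over $100', () => {
  expect(calculateCartTotal([{ price: 80 }, { price: 40 }])).toBe(108);
});

// API test: a given call returns the expected status and response body.
test('POST /api/cart persists the submitted line items', async () => {
  const res = await fetch('https://staging.example.com/api/cart', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ items: [{ sku: 'ABC-123', qty: 2 }] }),
  });
  expect(res.status).toBe(201);
  const body = await res.json();
  expect(body.items).toHaveLength(1);
});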

E2E testing is meant to ensure that user interactions always work and that a user can complete a workflow successfully. E2E test validations should therefore make certain that an interaction point (button, form, page, etc.) exists and can be used. They should then verify that a user can move through all of these interactions and that, at the end, the application returns what is expected, both in the individual elements and in the result of user-initiated data transformations. Well-built tests will also look for JavaScript or browser errors. If tests are written in this way, the relevant blocks of code and APIs will all be tested for functionality during the test execution.
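As a sketch of what that looks like in practice, here is a Playwright test against a hypothetical staging login page; the URL, labels, and credentials are assumptions, but the shape is the point: confirm each interaction point exists, move through the workflow, assert the expected end state, and fail on browser errors.

import { test, expect } from '@playwright/test';

test('user can sign in and reach the dashboard', async ({ page }) => {
  // Collect JavaScript/browser errors so the test fails on them too.
  const pageErrors: Error[] = [];
  page.on('pageerror', (err) => pageErrors.push(err));

  await page.goto('https://staging.example.com/login'); // hypothetical URL

  // Each interaction point must exist before it is used.
  await expect(page.getByLabel('Email')).toBeVisible();
  await page.getByLabel('Email').fill('user@example.com');
  await page.getByLabel('Password').fill('a-test-password');
  await page.getByRole('button', { name: 'Sign in' }).click();

  // The workflow ends in the state the user expects.
  await expect(page.getByRole('heading', { name: 'Dashboard' })).toBeVisible();
  expect(pageErrors).toHaveLength(0);
});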

Which User Flows to Follow

The risk of a bloated test suite, beyond high maintenance cost, is runtime that grows too long for tests to be run in the deployment process for each build. If you keep runtimes to only a few minutes, you can test every build and provide developers immediate feedback about what they may have broken, so they can rapidly fix the bug.

To prevent test suite bloat, we suggest splitting your test cases into two groups: core and edge. Core test cases reflect your core features: what people are doing repeatedly. These are usually associated with revenue or bulk usability; a significant number of users are doing them, so if they fail you’re in trouble. Edge cases are the ways people use the application that are unexpected, unintended, or rare, but might still break the application in an important way. The testing team will need to pick and choose which of these cases to include based on business value. Be careful not to write an edge-case test for every edge bug that occurs: endlessly playing whack-a-mole can again cause the edge test suite to become bloated and excessively resource-intensive to maintain.
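One lightweight way to make the split explicit is to tag test titles, as in this sketch (Playwright, with hypothetical routes and copy; the relative URLs assume a baseURL in the project configuration):

import { test, expect } from '@playwright/test';

// Core: a high-volume, revenue-bearing flow.
test('guest checkout completes @core', async ({ page }) => {
  await page.goto('/checkout');
  await expect(page.getByRole('heading', { name: 'Checkout' })).toBeVisible();
});

// Edge: rare, but it has broken the application in an important way before.
test('checkout with an expired coupon shows a clear error @edge', async ({ page }) => {
  await page.goto('/checkout?coupon=EXPIRED2019');
  await expect(page.getByText('This coupon has expired')).toBeVisible();
});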

If runtime allows, we recommend running your core and edge tests with every build. Failing that, we recommend running core feature tests with every build, and running the longer-runtime edge case tests occasionally, in order to provide feedback on edge case bugs at a reasonable frequency.
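A Playwright configuration can then turn those tags into separately runnable suites; everything here except the grep patterns is an assumption about your environment:

// playwright.config.ts
import { defineConfig } from '@playwright/test';

export default defineConfig({
  use: { baseURL: 'https://staging.example.com' }, // hypothetical staging environment
  projects: [
    { name: 'core', grep: /@core/ }, // short suite, run in every build's pipeline
    { name: 'edge', grep: /@edge/ }, // longer suite, run on a schedule
  ],
});

The pipeline for each build would run npx playwright test --project=core, while a nightly or weekly job would run npx playwright test --project=edge.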

How to Design Test Cases

Every test case, whether it is core or edge, should focus on a full and substantial user experience. At the end of a passing test, you should be certain that the user will have completed a given task to their satisfaction.

Each test will be a series of interactions with elements on the page: a link, a button, a form, a drawing element, and so on. For each element, the test should validate that it exists and that it can be interacted with. Between interactions, the test writer should look for meaningful changes in the DOM that indicate whether or not the application has responded in the expected way. Finally, the data in the test (an address, a selected product, some other string or variable entered into the test) should be used to ensure that the application transforms or returns that data in the way it’s expected to.
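A sketch of that structure as a single test, again in Playwright with hypothetical routes and selectors; note that the street-address string entered at the start is the same data asserted on at the end:

import { test, expect } from '@playwright/test';

test('shipping address entered at checkout appears on the review page', async ({ page }) => {
  const street = '123 Example Street'; // data the test pushes through the workflow

  await page.goto('https://staging.example.com/checkout/shipping');
  await expect(page.getByLabel('Street address')).toBeVisible(); // the element exists...
  await page.getByLabel('Street address').fill(street);          // ...and can be interacted with
  await page.getByRole('button', { name: 'Continue' }).click();

  // A meaningful DOM change between interactions: the review step is now shown.
  await expect(page.getByRole('heading', { name: 'Review order' })).toBeVisible();

  // The application returns the data it was given, as expected.
  await expect(page.getByTestId('shipping-summary')).toContainText(street);
});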

If you build E2E test cases with this process in mind, you will achieve high-fidelity continuous testing of your application without pouring unnecessary or marginally valuable hours into maintaining the suite. You will be able to affordably ensure that users can use your application in the way they intend to.

Originally published at https://prodperfect.com.
