Key test automation takeaways for software testing professionals

Zelanda van Drünick · Published in Saratoga Software · 6 min read · Sep 12, 2022

By James Neethling, QA Competency Lead at Saratoga

When our clients approach us to help them with automating tests, they almost always mean browser-based User Interface (UI) testing and it’s usually for either smoke testing or regression testing. We recently undertook one of these projects for an international client and documented some of the practical considerations to enable effective test analysis when working on UI automation projects.

This specific project was with a client in the financial services industry. They're improving their development agility and recently embarked on the next phase of that journey by moving their on-premises infrastructure to the cloud. The client already had the foundations of a continuous deployment process in place, and they recognised the potential benefit of an automated smoke-test pack that could run after functionality was promoted between environments. As an automated process, this would take significantly less time than running the tests manually.

The Saratoga Quality Assurance team were tasked with creating this automated test pack. The application was a multi-tiered, web-based front-end that used an intermediate API layer to communicate with a legacy back-office application. While this fairly typical architecture provided many integration opportunities, not all the functions and capabilities of the back-office application were exposed, and this presented a technical challenge to the team.

Although there was a set of functional test cases which were being manually executed, there wasn’t a consolidated test pack performing sufficient testing of key business functions.

Our test automation process

Our approach for this project was to use a time-boxed, iterative development process which included both analysis and development. We had been provided with an initial set of functional test cases that we needed to automate, and the team loaded these into the backlog and worked with the client to prioritise the test cases.

Within two sprints we realised that the functional tests were not very helpful for understanding the underlying business value each test case was verifying. That made it difficult to have a conversation about prioritisation, or to determine how the test cases could be sequenced to minimise the amount of test set-up (e.g. data) required. So, while the Test Analyst focussed on gaining a deeper understanding of the Application Under Test (AUT) and on creating test cases that reflected the user journey through the application, the Automation Engineer unpacked the APIs and wrote helper functions to create the test data needed to support the test cases.
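As an illustration (not the project's actual code), a data-creation helper along these lines is the kind of thing the Automation Engineer would write; the endpoints, payloads and URL here are hypothetical stand-ins for the client's intermediate API layer:

```python
import requests

BASE_URL = "https://api.example.test"  # hypothetical environment URL


def create_customer(session: requests.Session, name: str) -> str:
    """Create a customer through the API layer and return its id,
    so UI test cases can start from known, freshly created data."""
    resp = session.post(f"{BASE_URL}/customers", json={"name": name})
    resp.raise_for_status()
    return resp.json()["id"]


def create_account(session: requests.Session, customer_id: str) -> str:
    """Attach an account to an existing customer."""
    resp = session.post(
        f"{BASE_URL}/customers/{customer_id}/accounts",
        json={"type": "cheque"},
    )
    resp.raise_for_status()
    return resp.json()["id"]
```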

This required a reassessment of the scope of the engagement to ensure that we could still achieve the expected outcome of the project: UI smoke tests to confirm a deployment. With the improvement in test-scripting time, and the consolidation of the various test cases into scenarios that minimised test data creation, the team were able to project the number of test scenarios and business processes that would be covered, and continually updated these projections.

This project highlighted some key considerations for software testing professionals

1. UI automation isn’t the only form of test automation

It may seem trivial to point out, but for many users, a browser clicking around and inputting data is what they picture when they talk about test automation. However, test automation applies to a vast range of testing types, so be sure you're using the most appropriate interface when automating test cases. It is almost always faster to implement lower-level test automation against supporting API and system interfaces than to automate through the UI.
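To make the contrast concrete, here is a minimal sketch of a lower-level check written directly against an API, assuming a hypothetical pricing endpoint; an equivalent check driven through the browser would need a login, several page navigations and form fills before it could assert anything:

```python
import requests

BASE_URL = "https://api.example.test"  # hypothetical environment URL


def test_quote_calculation_via_api():
    """Exercises a pricing rule directly through the API layer,
    skipping the browser entirely."""
    resp = requests.post(
        f"{BASE_URL}/quotes",
        json={"product": "home-loan", "amount": 250_000, "term_months": 240},
        timeout=10,
    )
    assert resp.status_code == 200
    assert resp.json()["monthly_repayment"] > 0
```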

2. UI browser automation is the ‘easy’ part

Stay with me here! The development effort of controlling the browser to automate test execution is the easy part of the process, but that doesn't mean it's easy. Significantly more challenging is ensuring that the application under test is in the correct initial state (its pre-conditions) for the test results to be relevant, and designing tests that are sufficiently isolated to provide actionable information. Often we find that the application wasn't designed with testability in mind, and in those cases retrofitting automated test cases is a real challenge, and an expensive one.
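One common way to manage pre-conditions, sketched here with pytest and Selenium against hypothetical URLs (not the project's actual stack), is a fixture that seeds the required state through the API before the browser is opened and cleans it up afterwards:

```python
import pytest
import requests
from selenium import webdriver

BASE_URL = "https://api.example.test"  # hypothetical API layer
APP_URL = "https://app.example.test"   # hypothetical front-end


@pytest.fixture
def customer_with_account():
    """Put the application into a known initial state through the
    API before the browser is even opened."""
    resp = requests.post(f"{BASE_URL}/customers", json={"name": "Smoke Test"})
    resp.raise_for_status()
    customer = resp.json()
    yield customer
    # Tear down so the next test starts from a clean slate.
    requests.delete(f"{BASE_URL}/customers/{customer['id']}")


def test_customer_search(customer_with_account):
    driver = webdriver.Chrome()
    try:
        driver.get(f"{APP_URL}/customers/{customer_with_account['id']}")
        assert "Smoke Test" in driver.page_source
    finally:
        driver.quit()
```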

3. Don’t skip Test Analysis

Getting a good inventory of the test cases that are needed allows for more effective prioritisation and packaging of the test cases, optimising the test set-up (e.g. data) requirements while still delivering actionable information about the quality of the Application Under Test. A few minutes spent in the Test Analysis phase can streamline development and promote the re-use of test data, saving significant effort. Good test data design can also dramatically reduce the maintenance cost of automated tests and inform the difficult trade-off between fully isolated tests and the cost of their set-up and tear-down.
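That isolation-versus-set-up-cost trade-off can be expressed directly in the test framework. A minimal pytest sketch, using an in-memory stand-in for a real data-creation API:

```python
import itertools

import pytest

_ids = itertools.count(1)


def create_portfolio():
    """Stand-in for an API call that creates test data."""
    return {"id": next(_ids), "positions": []}


@pytest.fixture(scope="module")
def shared_portfolio():
    # Created once per module: cheap to set up, but tests that
    # mutate it can interfere with each other.
    return create_portfolio()


@pytest.fixture
def isolated_portfolio():
    # Created per test: fully isolated, at the price of repeating
    # the set-up for every test case.
    return create_portfolio()


def test_read_only_check(shared_portfolio):
    assert shared_portfolio["positions"] == []


def test_mutating_check(isolated_portfolio):
    isolated_portfolio["positions"].append("ZAR bond")
    assert len(isolated_portfolio["positions"]) == 1
```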

4. The application architecture matters

The team implementing the test cases needs a good understanding of the application architecture and the tooling at their disposal. This enables them to make informed decisions about how best to put the application into the desired state to run a test, whether through APIs, direct access to the underlying data store(s), or other mechanisms. These approaches must be weighed against their development and maintenance costs, and this is where experience really helps.
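To illustrate that trade-off, the sketch below contrasts seeding state through an exposed API with writing directly to the data store (using sqlite3 as a stand-in for the real database); neither function reflects the project's actual implementation:

```python
import sqlite3

import requests


def seed_via_api(base_url: str) -> str:
    """Preferred when the operation is exposed: it exercises the real
    validation logic and survives schema changes."""
    resp = requests.post(f"{base_url}/customers", json={"name": "Test"})
    resp.raise_for_status()
    return resp.json()["id"]


def seed_via_database(db_path: str) -> int:
    """Fallback when no API exposes this data: faster to run, but it
    couples the test pack to the schema and bypasses business rules,
    so it is costlier to maintain."""
    with sqlite3.connect(db_path) as conn:
        cur = conn.execute(
            "INSERT INTO customers (name) VALUES (?)", ("Test",)
        )
        return cur.lastrowid
```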

Understanding the architecture and the specific implementation of the application's capabilities will also help with packaging test cases into scenarios that allow for the re-use of data and improve execution times. An agile approach allows the team to progressively mature the test cases as the whole pack matures, iteratively reducing fragility.

5. Plan for the limitations of test packs to retain the integrity of testing outputs

Automated tests are, at heart, computer programs, and are therefore subject to the same constraints that development teams know well. Expert judgement is required to tell the difference between over-engineering test cases and taking ‘short-cuts’ that will undermine test quality later.

It is easy to blur the lines between the different types of testing and the information each provides about overall system quality, so maintaining discipline about the purpose of smoke tests is important. In our example of automating the smoke-test pack, the team frequently went back to the objectives of the smoke tests to support decisions about which test cases to include and which to defer to other types of tests.

A smoke test typically covers the ‘happy path’. It may be necessary to take short-cuts in the creation of the test cases, or to bypass parts of the process whose timing makes them difficult or expensive to test completely. In these situations, detail the risks that the short-cuts introduce and make them explicit both in the test case design and in the test script (code), then rely on other testing methods to provide detailed information about those functional areas.
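One simple way to make a short-cut explicit in the test script itself is a prominent comment recording what was bypassed and why, as in this pytest sketch (the transfer API here is a stub invented for illustration):

```python
import pytest


class StubTransferApi:
    """Stand-in for the real API client used by a smoke pack."""

    def capture_transfer(self, amount):
        return "TRF-001"

    def get_transfer(self, ref):
        return {"ref": ref, "status": "captured"}


@pytest.fixture
def api_client():
    return StubTransferApi()


# SHORT-CUT: the overnight batch that settles a transfer takes hours,
# so this smoke test deliberately stops at "transfer captured".
# Risk: settlement failures are NOT caught here; they are left to
# the scheduled end-to-end regression pack.
def test_transfer_is_captured(api_client):
    ref = api_client.capture_transfer(amount=100)
    assert api_client.get_transfer(ref)["status"] == "captured"
```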

With this article, we’ve touched on some of the key takeaways and lessons to be learnt from test automation projects. Just as fast as the development tools and deployment patterns are changing, so too are the techniques and strategies within the field of software testing and quality assurance — and there are many more lessons to learn and share.

For more insights from our Quality Assurance team, read ‘Common misconceptions about software testing’ by Shinaaz Osman, Senior Quality Assurance Consultant at Saratoga.

About Saratoga

By combining our decades of software engineering, business analysis, project management and software testing expertise with our pool of phenomenally talented professionals, we work with organisations to see great technology ideas successfully delivered.

Working with our global client base, we convert business ideas into quality technology solutions which create meaningful business benefits.
