We recently completed a project where Katalon Studio played a significant role in the project’s overall success. I have long been a believer in the benefits of incorporating a degree of automated UI testing in our development process, but until now have been defeated by the cost or time (or both) required to author robust UI tests.
Our Use Case
We were approached by a client with a large, undocumented, 15-year-old PHP application. They needed to upgrade it to a supported PHP version (7.3 in this case) and resolve some significant security issues that were prevalent throughout the codebase. A rewrite may well have been the technically correct solution, but they did not have the time, budget, or inclination to pursue this. We needed a way to deliver the work in a three-month timeframe at low cost. The only way to achieve this was to script the necessary changes to the codebase, but this still left us with the problem of both validating the security changes we made and performing regression testing on the rest of the solution. It was obvious we needed to automate this testing to be able to move quickly. We chose Katalon Studio because it was free, provided decent test recording capabilities and, most importantly, an object repository for UI elements.
Our approach was as follows:
- Write a test suite (where all tests pass) against the codebase as it was provided to us, covering as many paths through the application as we could identify;
- Apply an individual security or upgrade change script;
- Re-run the test suite;
- Investigate and resolve any failing tests;
- Repeat steps 2 to 4 until complete.
What is Katalon Studio?
Katalon Studio is a full test automation solution for web, mobile and API testing. It includes a full Integrated Development Environment for recording, authoring, debugging and executing tests. For web testing (our focus on this project), it is built on top of the open-source Selenium framework. One of the issues I’ve experienced with Selenium in the past is in managing and reusing UI elements. Projects in Katalon Studio include an Object Repository that goes some way to alleviating this problem.
Our Team and Environment
Our original hope was that a Product Owner in our company would do most of the test creation. Due to resourcing constraints that was not possible, so test creation was performed by the developers who would subsequently work on the system changes. Katalon Studio and the test solution were running on Macs, with the PHP solution running inside Docker containers (on our Macs for development, and deployed on VMs for integration and QA testing).
Experiences Authoring Tests
For the first few test cases, we used the Record and Playback feature of Katalon Studio, in the hope that we could record whilst we explored the system and then have a set of tests to execute forever more. Sadly it didn’t quite work out like that. For simple user interfaces this may work like a charm, but unfortunately the system we were working on was complex, with horrendous hierarchies of nested HTML tables and other anti-patterns that revealed the system’s age. We found that the recorder fairly regularly did not pick up an interaction with an element on the page. This is pretty frustrating when you record a long script but discover on playback that it doesn’t work, requiring you to use the Web Spy to capture the element that was missed and manually add it into the script.
We also quickly realised that we wanted to modularise our tests, which the Record feature made difficult. For example, we wanted all our tests to be self-contained, using the following pattern:
- Perform an action (basically add, edit or delete a record, or view a list and select an item to view details, etc);
- Verify the action was performed successfully;
- Reverse any data changes that occurred.
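As a rough illustration of this pattern, a self-contained test case in Katalon Studio's Script Mode might look something like the sketch below. This only runs inside the Katalon Studio runtime, and all the Object Repository paths, field values and the base URL variable are invented for illustration, not taken from our actual project:

```groovy
import static com.kms.katalon.core.testobject.ObjectRepository.findTestObject
import com.kms.katalon.core.webui.keyword.WebUiBuiltInKeywords as WebUI
import internal.GlobalVariable

// Log in and navigate to the feature under test
WebUI.openBrowser('')
WebUI.navigateToUrl(GlobalVariable.baseUrl)  // baseUrl is a hypothetical profile variable
WebUI.setText(findTestObject('Login/input_username'), 'testuser')
WebUI.setText(findTestObject('Login/input_password'), 'secret')
WebUI.click(findTestObject('Login/button_submit'))

// 1. Perform an action: add a record
WebUI.click(findTestObject('Customers/link_add'))
WebUI.setText(findTestObject('Customers/input_name'), 'Smoke Test Customer')
WebUI.click(findTestObject('Customers/button_save'))

// 2. Verify the action was performed successfully
WebUI.verifyElementPresent(findTestObject('Customers/row_smoke_test_customer'), 10)

// 3. Reverse the data change so the test leaves the system as it found it
WebUI.click(findTestObject('Customers/row_smoke_test_customer'))
WebUI.click(findTestObject('Customers/button_delete'))
WebUI.verifyElementNotPresent(findTestObject('Customers/row_smoke_test_customer'), 10)

WebUI.closeBrowser()
```

The reversal step is what makes the test repeatable: each case can be run in any order, any number of times, without leaving residue in the database.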
We quickly tired of recording the Login and Logout steps for every test case. The system also had different organisational levels that had to be selected after logging in. We also got fed up with recording that step, especially since this feature was one that the Recorder had a lot of issues with.
This meant we ended up switching to using the Web Spy to capture elements into the Object Repository and then wrote our tests manually. At first we used the manual UI provided by Katalon Studio, but for us it ended up being much easier to create and maintain the tests in Script Mode using Groovy. As we became more comfortable with the testing flow we created a basic test case template that we used to start each test. Our workflow became:
- Explore the feature we want to test, map out the basic flow of user interactions in this feature.
- Turn on Web Spy and capture all the elements on the pages for the feature under test.
- Review the captured elements, adjust names, adjust the XPath or selectors where appropriate.
- Save the captured elements into the Object Repository.
- Create a test case from our template, switch to Script Mode and add the interactions and verifications.
- Execute the test, verify the behaviour of the feature under test and correct if necessary.
Whilst undoubtedly leaving room for improvement, this approach allowed us to create 157 smoke tests to validate the behaviour of the entire system (over 250 separate HTML pages) in 35 resource days, or around 4.5 test cases per person per day.
The Good, the Bad and the Ugly
Having completed our first full-on project with Katalon Studio, our team has had time to develop strong opinions on what they like and don’t like about the product. Of course, this is based on our current level of understanding (or ignorance) of the tool and our particular use case for this project.
- Undoubtedly the best bang for your buck amongst UI test automation tools. Regardless of everything else we have to say about Katalon Studio, you must not lose sight of the fact that this is a free solution in a field containing some crazily expensive offerings. Does that mean you have to accept some rougher edges than you would otherwise? Yes. But it definitely saved us a lot of time versus working directly with Selenium, and we would struggle to justify a move to a commercial solution in the medium term.
- Fairly low learning curve. Whilst we didn’t end up using the Recorder functionality for the full duration of the project, it helped us get up and running quickly. Everything we subsequently learnt about manual tests and writing test scripts in Groovy came about because we could inspect our initial tests to see how they were put together. For simple systems, or for just quickly getting some smoke tests in place, the Record feature may be all you need.
- The Object Repository. Many years ago, trying to write UI tests meant painstakingly adding XPath expressions into your test automation code. If you were a developer, you quickly realised that you needed to create some kind of library for your UI elements, but you needed a benevolent project manager or tech lead to grant you the time to set that up. The Object Repository provides a good way of getting started with this kind of library. Without it, Katalon would not have been a viable choice for our needs.
- The Object Spy. We really found this beneficial. Tying in nicely with the Object Repository, being able to quickly grab the XPath expression or other selector for a UI element was great. It comes with some quirks: we occasionally found it not registering a capture command, or becoming very slow and laggy. Our biggest gripe was the choices the Object Spy made regarding the default selector for an object, particularly when it would ignore id or name attributes on the element and construct a selection expression based on less stable attributes. We suspect this may be because Katalon's "Relative XPath" feature was in beta during our project, so this should improve over time.
- Profiles, Data-Driven Tests and Test Suites. We made use of all three of these features to save us plenty of time, but we probably didn’t even scratch the surface of their full potential. Test Suites quite simply allow a set of test cases to be grouped together, so we could run all the test cases for a particular module of the application. We also used a simple Data File to allow each test case to be executed for each of the different levels of access the application allowed. Finally, Profiles allowed us to execute the test suites against different environments, like our local development environment or the QA environment.
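Roughly how these pieces fitted together for us is sketched below (again Katalon-runtime Groovy, with the data file name, column names and profile variable all invented for illustration): an execution profile supplies environment-specific values as GlobalVariable entries, while a data file drives the same steps across each access level:

```groovy
import static com.kms.katalon.core.testdata.TestDataFactory.findTestData
import internal.GlobalVariable

// The profile selected at run time ('local', 'qa', ...) populates GlobalVariable,
// e.g. GlobalVariable.baseUrl, so the same suite can target different environments.
def levels = findTestData('Data Files/AccessLevels')  // hypothetical data file

for (int row = 1; row <= levels.getRowNumbers(); row++) {
    String username = levels.getValue('username', row)
    String orgLevel = levels.getValue('orgLevel', row)
    // Log in as this user against GlobalVariable.baseUrl, select orgLevel,
    // then run the shared test steps for the feature under test...
}
```

In practice Katalon can also bind a data file to a test case via the Test Suite configuration rather than an explicit loop; the sketch above just shows the moving parts.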
- Not developer focussed (files and structure). Look, this is a matter of personal preference and we completely understand that developers might not be the top target user for Katalon Studio. That said, there were some choices that have been made that certainly irritated us as we used the product. In particular, Katalon Studio creates a huge number of files behind the scenes and the UI does not tell you which files are being changed and where they are. For example, a test case will have an XML file created for it in the Test Cases folder with the name of the test case. But the actual test steps are defined as Groovy in a separate file (with a randomly generated name) in the Scripts folder. This was an issue for us when committing our test changes to git, as it made checking that only relevant changes were being committed more difficult than it needed to be.
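One thing that helped keep our commits reviewable was ignoring Katalon's generated artefacts. A .gitignore along these lines (adapted from what Katalon's own guidance suggests; entries may vary by version) is a reasonable starting point:

```gitignore
# Compiled output and downloaded libraries
/bin/
/Libs/
# Eclipse workspace metadata
/.settings/
.classpath
# Execution reports, regenerated on every run
/Reports/
```

With these out of the way, a diff is mostly limited to the test case XML, the Groovy scripts and the Object Repository files you actually changed.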
- Not developer focussed (magic strings). Again a matter of personal preference, but given we were developing most of our tests in script mode by the end of the project, having to reference UI elements or custom keywords using a string key was not ideal. First prize here for us would definitely be a strongly typed Object Repository, perhaps accessible using Page.ElementName syntax. The drag and drop functionality from the Object Repository (it puts the findTestObject('object name') text into the script for you) is nice, but as developers we tend not to want to leave the keyboard when we are writing code.
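One workaround along these lines, sketched here as a hypothetical helper rather than anything Katalon provides, is a Groovy class per page that wraps findTestObject, so scripts reference elements as properties instead of repeating string keys:

```groovy
import static com.kms.katalon.core.testobject.ObjectRepository.findTestObject
import com.kms.katalon.core.testobject.TestObject

// Hypothetical page class: one getter per element. The string keys live in
// exactly one place, and test scripts get code completion on property names.
class LoginPage {
    static TestObject getUsername() { findTestObject('Login/input_username') }
    static TestObject getPassword() { findTestObject('Login/input_password') }
    static TestObject getSubmit()   { findTestObject('Login/button_submit') }
}

// Usage in a test script (Groovy resolves LoginPage.username to getUsername()):
// WebUI.setText(LoginPage.username, 'testuser')
```

It is not true static typing of the repository, but it confines the magic strings to a single file per page.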
- Clunky UI. We can split this into two parts. First, the UI is built on Eclipse, which is very popular, but personally not something I want to be working with every day. There were a couple of real irritants here. Mouse targeting is imprecise. Think you are clicking on that dropdown arrow next to the run button? Nope, you've clicked the run button and your test is going to execute with the default browser. Dragging and dropping has a similar lack of precision, especially when you are working in script mode. The air in the office regularly turned blue when we attempted to drag an object from the repository into a script only to find that it actually got added halfway through the line above or below.
The second major irritant was the need to double-click into boxes to edit them and then explicitly click out again, or your change would be lost. This is just a terrible, terrible user experience. I'm not sure we would have continued using the application if we were stuck with using manual mode and dealing with that quirk.
- Random bugginess in the Selenium web drivers. This is very frustrating and cost us quite a bit of time. Periodically, and apparently randomly, test cases would fail because of some transient issue with the web driver. Here are some examples:
- A particular element would become un-clickable (there was a scrollbar nearby, but not obscuring the element). Restarting the browser, web driver and Katalon would often resolve the problem.
- Random errors in the middle of a test case with messages like “Couldn’t connect to web driver” or “Timed out waiting for a response from the web driver”, etc.
- Irritating inability to save preferences. I do not want the welcome screen opening every time I start Katalon Studio. Cool, there is an option to stop this. I have clicked it and saved it many, many times. Still, the welcome screen opens every time Katalon Studio does. There were a few settings like this. No deal breaker, just irritating.
- Who moved my cheese? Once upon a time there was a section in the tree view of Katalon Studio for test reports. This could get quite big and unwieldy as you executed more and more tests, but was useful to us because we could look at previous test runs easily to understand if we were seeing some transient issue or if there was a legitimate problem that needed to be resolved. Then we installed a minor update. Suddenly the section was gone. The report files were still there in the filesystem, but the only reports we could see were for the last run of each test suite. This was unexpected from a minor version update.
Of all the irritants (or areas for improvement), the prime one for us was around performance and how it degraded over time. There are a lot of moving parts to a test execution, from Katalon itself to the web driver to the browser being used for a web test. Whilst we were in test authoring mode and regularly executing tests, we found that the performance of our systems degraded over the course of a day. Generally a reboot would resolve this, but in one extreme case the only solution we found was to completely uninstall and reinstall Katalon. Given that our machines are high-spec developer machines, we are concerned as to whether we can roll this tool out to other members of our project teams without upgrading their machines.
This article has set out our initial experiences of using Katalon Studio to automate the testing of a brownfield web application. Katalon Studio is not perfect, but it is free, and we found it an invaluable tool for the project we were working on. In fact, it is safe to say that we would not have been successful with this project if we hadn’t invested in automating UI tests up front, and Katalon Studio was the only feasible way to achieve this.
Automated UI testing is an important part of the Quality Assurance mix for a software project. Whilst the proportion of UI tests to unit tests will vary from project to project, at least some basic smoke tests to verify releases should be mandatory. If you or your company do not have the budget to spend on one of the commercial offerings, Katalon Studio is, in our opinion, the best "free" option. The trade-off is that, like most free tools, the cost of learning to work efficiently with the tool is not insignificant. Finding an experienced user to help you get up to speed is an excellent idea.
DataDIGEST provides small, tightly-focussed technical teams that get things done, fast. We help companies solve business problems with business solutions in the cloud. Visit our website to find out how we can help you build and launch your web or mobile MVP, get your software project back on track, or increase the efficiency of your internal business processes with Office 365 and/or Dynamics 365.