Automated UI testing for Carousell Web

Syam Sasi · Published in Carousell Insider · 7 min read · Apr 7, 2022

Carousell started off as a mobile-first product, which is why we automated our mobile apps before the web. We built Caroufarm, our in-house virtual device farm, to support mobile app testing, and the demand for web automation became evident soon after.

In this article, we will walk through the challenges we faced while implementing web automation, and how we integrated desktop and mobile browser testing into our continuous delivery pipeline.

Challenges

No end-to-end UI tests were available for the web platform

Before we automated UI tests for the web, the release engineer had to manually verify the critical flows for each daily release. This was necessary even though we had more than 80% unit test coverage exercising deeper business logic, because unit tests do not provide real-world UI testing of feature flows against actual backend services.

We needed to support both desktop web and mobile web

Carousell UI flows on desktop web (dWeb) and mobile web (mWeb) are different because our product experience is separately optimised for each screen size.

No dedicated test environment to run automated tests

Our team was using a shared staging environment, which was flaky because multiple services were deployed to it at the same time. Sanity checks mostly happened only when production-ready code was tested in a canary instance.

Solution

We didn’t want to reinvent the wheel for web automation testing: we already had a well-established mobile automation framework and testing infrastructure, so we decided to extend it to cater to web testing.

We have more than 250 scenarios automated for our Android and iOS apps, and most of these test cases could be reused for desktop and mobile web as well.

Simulating mobile views in Chrome

With the help of the following code, it is easy to emulate a mobile view in Chrome. mobileEmulation can be set to your preferred device, and Chrome will start with the configured viewport properties.
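Here is a minimal sketch of that setup using Selenium’s ChromeOptions (the device name is just an example, not our exact configuration):

```java
import java.util.HashMap;
import java.util.Map;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

public class MobileEmulationDriverFactory {

    // Starts Chrome with mobile emulation enabled for the configured device.
    public static WebDriver createMobileWebDriver() {
        Map<String, String> mobileEmulation = new HashMap<>();
        // Any device profile known to Chrome DevTools can be used here.
        mobileEmulation.put("deviceName", "iPhone X");

        ChromeOptions options = new ChromeOptions();
        options.setExperimentalOption("mobileEmulation", mobileEmulation);

        return new ChromeDriver(options);
    }
}
```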

Accessing the camera on mobile web

Carousell Sell Flow on iPhone Safari

The sell flow is one of the critical features we wanted to automate, and it includes taking photos with the phone camera. The trick here is to switch to the native image picker view and access the camera using AppiumDriver, and then switch back to the web view to continue the tests.
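A simplified sketch of that context switch with the Appium Java client is shown below (the locator and helper names are illustrative, not our exact implementation):

```java
import java.util.Set;

import io.appium.java_client.AppiumBy;
import io.appium.java_client.ios.IOSDriver;

public class CameraHelper {

    // Switch from the web view to the native context, take the photo using
    // the native image picker, then return to the web view.
    public static void takePhotoInSellFlow(IOSDriver driver) {
        String webContext = driver.getContext(); // e.g. "WEBVIEW_1"

        // Find and switch to the native context (usually named NATIVE_APP).
        Set<String> contexts = driver.getContextHandles();
        for (String context : contexts) {
            if (context.contains("NATIVE_APP")) {
                driver.context(context);
                break;
            }
        }

        // Interact with the native camera / image picker (illustrative locator).
        driver.findElement(AppiumBy.accessibilityId("Take Photo")).click();

        // Switch back to the web view and continue the web test.
        driver.context(webContext);
    }
}
```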

Minimising complexity with a business layer

1. Step Layer

The step layer is a group of step classes that define common step methods for each platform (iOS, Android, dWeb, mWeb).

At Carousell, we are reusing the same Cucumber test scenarios for all four platforms. The Cucumber scenario steps have a one-to-one mapping with Cucumber step methods.
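A simplified sketch of such a shared step class (the BusinessFactory helper and the step wording are illustrative):

```java
import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;

// Common step definitions shared by all four platforms; each step delegates
// to the business layer instead of touching page objects directly.
public class ChatSteps {

    // BusinessFactory returns the Chat business object for whichever
    // platform the current run is configured against.
    private final Chat chat = BusinessFactory.chat();

    @When("the seller marks the listing as sold")
    public void sellerMarksListingAsSold() {
        chat.markAsSold();
    }

    @Then("the listing is shown as sold in the chat")
    public void listingIsShownAsSold() {
        chat.verifyListingMarkedAsSold();
    }
}
```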

2. Business Layer

The business layer is a group of router classes that select and manipulate the right page objects behind the step layer’s common methods. Instead of calling page objects directly in step classes, we call business class objects.

Chat is an abstract class, and the business logic for each platform is implemented at the subclass level. The corresponding subclass is instantiated based on which platform is configured for testing.

For example, “marking as sold” may not have the same business flow on all four platforms. By leveraging the abstraction, the different user actions needed for a business flow can be controlled in the corresponding platform-level subclasses.

When markAsSold() is called on the chat business class and the current run is configured for mobile web, the markAsSold() implementation in ChatMWeb is executed.

If there is a tutorial page that has to be dismissed before marking a listing as sold on mobile web, this logic is handled in the ChatMWeb subclass.

Abstract Chat business layer

Business layer subclass for Desktop Web

Business layer subclass for Mobile Web
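The screenshots above show the real classes; the sketch below captures the idea in simplified form (the page object and DriverManager names are illustrative):

```java
// Abstract business class: shared flows live here, while platform-specific
// behaviour is pushed down into the subclasses.
public abstract class Chat {

    public abstract void markAsSold();

    // Verification that is identical on every platform can stay in the
    // abstract class.
    public void verifyListingMarkedAsSold() {
        // common assertions against the chat screen
    }
}

// Desktop web: the listing can be marked as sold directly from the chat screen.
class ChatDWeb extends Chat {

    private final ChatPageDWeb chatPage = new ChatPageDWeb(DriverManager.getDriver());

    @Override
    public void markAsSold() {
        chatPage.clickMarkAsSold();
    }
}

// Mobile web: dismiss the tutorial overlay first, then mark as sold.
class ChatMWeb extends Chat {

    private final ChatPageMWeb chatPage = new ChatPageMWeb(DriverManager.getDriver());

    @Override
    public void markAsSold() {
        chatPage.dismissTutorialIfPresent();
        chatPage.clickMarkAsSold();
    }
}
```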

3. Page Object Layer

The page layer is a group of page classes that model user behaviours and interactions within a page. The elements are defined here following the page object pattern. Since the UI differs between desktop and mobile web, we have separate page objects for each.
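As an illustration, the desktop chat page object might look like the sketch below (the locators are illustrative); ChatPageMWeb would expose the same actions against the mobile web layout’s own locators:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Desktop web chat page: user interactions on the dWeb chat layout.
public class ChatPageDWeb {

    private final WebDriver driver;

    // Locators are defined once per page, following the page object pattern.
    private final By markAsSoldButton = By.cssSelector("[data-testid='chat-mark-as-sold']");

    public ChatPageDWeb(WebDriver driver) {
        this.driver = driver;
    }

    public void clickMarkAsSold() {
        driver.findElement(markAsSoldButton).click();
    }
}
```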

4. Page Component Layer

The page component layer is a group of classes that encapsulate reusable groups of UI components on a page. If common web elements are used across multiple pages, we extract them into components and include them inside the page objects.

Left: Home Page, Right: Profile Page

The search bar is common to both the home page and the profile page. Instead of treating the search bar as two different elements on the home page and the profile page, we implemented it as a page component.

And now we can use HeaderToolbar directly in the HomePage class.
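A sketch of this composition (locators and method names are illustrative):

```java
import org.openqa.selenium.By;
import org.openqa.selenium.Keys;
import org.openqa.selenium.WebDriver;

// Reusable page component for the shared header search bar.
public class HeaderToolbar {

    private final WebDriver driver;
    private final By searchInput = By.cssSelector("[data-testid='search-input']");

    public HeaderToolbar(WebDriver driver) {
        this.driver = driver;
    }

    public void searchFor(String keyword) {
        driver.findElement(searchInput).sendKeys(keyword, Keys.ENTER);
    }
}

// The home page composes the component instead of redefining the search bar;
// the profile page does exactly the same.
class HomePage {

    private final HeaderToolbar headerToolbar;

    public HomePage(WebDriver driver) {
        this.headerToolbar = new HeaderToolbar(driver);
    }

    public void searchFor(String keyword) {
        headerToolbar.searchFor(keyword);
    }
}
```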

The best locator strategy


Some test engineering teams work in isolation and do not have the luxury of communicating with development teams often, which makes it harder to get unique identifiers added to web elements for automated testing. Relying on unstable locator strategies leads to unstable tests.

But if you are sitting with the development team, just ask them to help add a unique id or attribute to these web elements.

Our awesome web engineers at Carousell have helped to standardise on data-testid as a dedicated attribute on HTML elements. This approach makes the tests less fragile and increases confidence in their stability.
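For example, a locator built on the dedicated attribute stays valid even if styling classes or the surrounding DOM change (the attribute value below is illustrative):

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

public class LocatorExample {

    // Markup: <button data-testid="sell-button">Sell</button>
    public static WebElement sellButton(WebDriver driver) {
        return driver.findElement(By.cssSelector("[data-testid='sell-button']"));
    }
}
```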

Cloud service evaluation

The next step was to evaluate which cloud service provider suits us best for web testing. Although we were already using BrowserStack for mobile app automation testing, we wanted to give a fair chance to all the leading vendors.

Data as of March 2021

We came up with a set of selection criteria and evaluated BrowserStack, Experitest, LambdaTest, pCloudy, Sauce Labs, TestingBot and Testsigma. We scored each vendor against our requirements: well supported (green), partially supported (amber) and not supported (black). After careful evaluation, we chose BrowserStack for our web automation needs.

Extending Caroufarm’s test infrastructure for web

Reusing our existing test infrastructure was fairly straightforward. We added headless Chrome to the existing Docker container and packaged the web tests as an executable JAR that is sent to each container. With the help of AWS SQS, separate queues are maintained for test requests and test results, and the status of each test scenario is updated through these queues.
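As an illustration of the queue-based flow, a scenario result could be pushed to the results queue roughly like this (shown with the AWS SDK for Java v2; the queue URL and payload shape are assumptions, not our exact implementation):

```java
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.SendMessageRequest;

// Publishes the status of a test scenario to the results queue so the
// pipeline can track each run.
public class ResultPublisher {

    private final SqsClient sqs = SqsClient.create();
    private final String resultsQueueUrl =
            "https://sqs.ap-southeast-1.amazonaws.com/123456789012/caroufarm-web-results";

    public void publish(String scenarioName, String status) {
        String body = String.format(
                "{\"scenario\":\"%s\",\"status\":\"%s\"}", scenarioName, status);

        sqs.sendMessage(SendMessageRequest.builder()
                .queueUrl(resultsQueueUrl)
                .messageBody(body)
                .build());
    }
}
```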

Caroufarm web in action

Enabling UI tests in the release pipeline

Karowl is our in-house pipeline framework which handles the deployment of both services and web applications. Once the web application has been deployed to a dedicated QA environment, the pipeline triggers the UI test job, which runs the desktop and mobile UI tests on BrowserStack.

The pipeline waits for the result and proceeds to the canary deployment if all tests pass. The test framework sends a Slack notification to the appropriate channel to update our web engineers about the status of the release tests.

If any of the tests fail, the test engineer is tagged in the notification and has to analyse whether the failure is due to a bug or a false positive.

The Slack message contains the following information:

  • dWeb or mWeb
  • UI test Jenkins job link
  • Karowl deployment job link
  • Release engineer
  • Summary of test results
  • Link to the test results (on Zephyr Scale)

The impact

We collected the following metrics to measure the effectiveness of automation after the first 30 days.

  1. Total time saved by automation: Web release engineers would typically have to spend 11 hours 35 minutes and 20 seconds to verify the core flows manually if there were no automated tests.
  2. Time taken to complete a test cycle: We are running 40 test scenarios in the release pipeline at the moment. With the help of parallel runs, we managed to bring the total execution time to under 12 minutes.
  3. Number of critical issues found: The test suite prevented a critical issue which could have slipped into the production environment.
  4. Passing rate: The test suite has been highly stable, with a 98.98% pass rate.

Feedback from web platform engineers

And big thanks to the Test Engineering Team for making this happen.
