Improve Your Testing Strategy: Our Parallelization and Reporting Journey

Kubra Cebbar
Published in Trendyol Tech · 8 min read · Oct 23, 2023

When working in a large organization like Trendyol, it is important to write UI (User Interface) tests that verify user experience and functionality, and then to speed those tests up and visualize their results clearly. As the Seller Growth team, we wanted to improve the UI tests of our Growth Center, Strike Point, Notification Center, Calendar, and Seller Badge screens on the Trendyol Seller Panel. These screens help sellers grow their stores, assign various tasks that encourage sellers to actively use all the features on the panel, and support their level progression, thus increasing their interaction with customers and the panel.

In this article, you will learn step-by-step how to parallelize and effectively report your UI tests using Cypress.

Before starting, I would like to explain why we, as the Seller Growth team, decided to build our UI tests in Cypress and why we chose to run them in parallel:

  • Cypress allows you to monitor tests in real time and visually follow the testing phases in the application. This makes it much easier to identify and debug failures quickly.
  • Cypress runs in real browsers, helping tests reflect the real user experience of the app.
  • A large open-source community supports Cypress. This community is constantly working to solve problems and improve features.

After we decided to build our UI tests in Cypress, we started with our biggest project, the Seller Growth Center screens. This is a single screen that combines two projects: the goals assigned to the seller and the seller's levels. Our test project came to host many tests, turned into a big, complex structure, and the combined test suites took longer and longer to run as the number of test cases grew. So the UI tests for the page had to be divided into many suites: a seller level suite, a challenge list suite, a challenge detail suite, and so on (a minimal sketch of one such suite follows below).
We decided to run the tests in parallel in order to manage these suites better, shorten run times by starting the suites at the same time, reach failures faster, and make the development process more efficient.
Our biggest concerns were integrating the parallelization step into Cypress and our pipelines, and producing a single combined report from all the suites. I will explain how we resolved these concerns in the rest of the article.
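
To give a sense of the structure, here is a minimal, illustrative sketch of what one of these suites might look like (the route and selectors are hypothetical, for illustration only, not our actual ones):

// challenge-list.cy.js: an illustrative suite (route and selectors are hypothetical)
describe('Challenge List Suite', () => {
  beforeEach(() => {
    cy.visit('/growth-center/challenges') // hypothetical route
  })

  it('lists the challenges assigned to the seller', () => {
    cy.get('[data-testid=challenge-card]').should('have.length.greaterThan', 0)
  })

  it('opens the detail page when a challenge is clicked', () => {
    cy.get('[data-testid=challenge-card]').first().click()
    cy.url().should('include', '/challenge-detail') // hypothetical path
  })
})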

Growth Center Screen

After creating the project in Cypress and writing the tests, it’s time to see how we run our tests in parallel.

Parallelization

First, we started with Cypress's documentation and found that Cypress Cloud could do this, but there was a catch: we would have to pay for the Cloud and record our test runs to Cypress's servers. We did not want to create an account, and we had questions about whether doing so would create a security vulnerability, so we continued to investigate.

Then we met sorry-cypress, an open-source alternative to Cypress Cloud. sorry-cypress provides a service where we can run tests in parallel for free; we connect to it through the cypress-cloud package from Currents, a cloud-based, production-grade, and affordable alternative to Cypress's cloud system.

First of all, following the configuration shared in the sorry-cypress documentation, we deployed the agoldis/sorry-cypress-director:2.5.10 Docker image as a pod (the director listens on port 1234 by default, which is the service URL we point Cypress at later).

Then, we installed the necessary dependencies in our Cypress project.

npm install cypress-cloud cypress

Next, we defined the address of the service we had deployed as a variable on the GitLab side and read it in the pipeline through the .gitlab-ci.yml file in the project.

variables:
  CYPRESS_API_URL: $CYPRESS_API_URL

Before explaining the remaining steps individually, I would like to mention a problem we encountered at this point. When we completed all the steps and tried to run the tests, we hit the following error:

We could not find a Cypress Cloud project with the projectId: ****
This projectId came from your cypress.config.js file or an environment variable. Please log into Cypress Cloud and find your project.

At first, we thought there was some place where we could not bypass Cypress Cloud. As we continued to investigate, we saw people in this discussion (https://github.com/sorry-cypress/sorry-cypress/discussions/787) having the same problem, and we found out that Cypress was blocking the use of sorry-cypress and only allowed it on lower Cypress versions. As a solution, the sorry-cypress team suggested creating a new config file and setting the URL of the service we had deployed directly, so we proceeded that way.

We created a new config file (currents.config.js) in which we gave random string values for projectId and recordKey, and for cloudServiceUrl we set the sorry-cypress URL that we hosted in the first step.

// currents.config.js
module.exports = {
  projectId: "yyy",
  recordKey: "xxx",
  cloudServiceUrl: "http://localhost:1234",
};

The next step was to add cypress-cloud/plugin to cypress.config.js.

// cypress.config.js
const { defineConfig } = require('cypress')
const { cloudPlugin } = require('cypress-cloud/plugin')

module.exports = defineConfig({
  projectId: 'yyy',
  viewportWidth: 1366,
  viewportHeight: 1080,
  e2e: {
    async setupNodeEvents(on, config) {
      const result = await cloudPlugin(on, config)
      return result
    },
  },
})

We then added a new script to our package.json file to run the parallel command for our tests.

"cy:run:parallel": "npx cypress-cloud --parallel --record --key $KEY --ci-build-id $SORRY_CYPRESS_BUILD_ID"

Here, KEY holds the value we chose as the record key, and SORRY_CYPRESS_BUILD_ID is a unique id, derived from the time each pipeline runs, that we generate in the setup step of .gitlab-ci.yml. We also added the parallel keyword to our run step to indicate how many pods we wanted the tests to run in.

variables:
  SORRY_CYPRESS_BUILD_ID_FILE: ./sorry_cypress_build_id.txt

Setup UI Test:
  stage: Setup UI Test
  script:
    - echo "export SORRY_CYPRESS_BUILD_ID=$(date +%s)" > $SORRY_CYPRESS_BUILD_ID_FILE
  artifacts:
    paths:
      - $SORRY_CYPRESS_BUILD_ID_FILE
    expire_in: 2 mins

Run UI Tests:
  stage: Run UI Tests 🚀
  parallel: 6
  before_script:
    - npm ci
    - export BROWSER=chrome
    - export KEY=***
    - source $SORRY_CYPRESS_BUILD_ID_FILE
  script:
    - npm run cy:run:parallel
    - mkdir cypress/reports/mochawesome_$CI_JOB_ID
    - mv cypress/reports/mochawesome/* cypress/reports/mochawesome_$CI_JOB_ID
  artifacts:
    expire_in: 1 day
    when: always
    paths:
      - cypress/videos/**/*.mp4
      - cypress/screenshots/**/*.png
      - cypress/reports/mochawesome_$CI_JOB_ID
Parallel Pod Screen

Now, our tests are ready to run in parallel, and we can already see that our pipeline is one minute faster. The difference will grow as more tests are written.

Parallel and Non-parallel Test Pipelines

We have created our test project and ensured that our tests run in parallel. It’s time to see how we can report our test results.

Reporting Tests

We started to investigate how we could view and review our test results. We found that Cypress, with the Mocha-based mochawesome reporter, provides easy, simple, and understandable reports, videos, and screenshots, and we also set up a Slack integration so that the whole team could follow the results.

First, we install the necessary dependencies for mochawesome.

npm install mochawesome 
npm install mochawesome-merge
npm install cypress-multi-reporters
npm install cypress-slack-reporter

Then we create a new reporter-config.json file to add our config information and add our scripts to package.json.

// reporter-config.json
{
  "reporterEnabled": "mochawesome",
  "reporterOptions": {
    "quiet": true,
    "reportDir": "cypress/reports/mochawesome",
    "overwrite": false,
    "html": false,
    "json": true,
    "charts": true
  }
}
// package.json
"scripts": {
  "mochawesome:report": "cypress run --reporter cypress-multi-reporters --browser $BROWSER --reporter-options configFile=reporter-config.json",
  "mochawesome:merge": "mochawesome-merge \"cypress/reports/mochawesome/*.json\" > mochawesome.json && npx marge mochawesome.json && npm run mochawesome:move",
  "mochawesome:move": "mv mochawesome-report cypress/reports/mochawesome && mv mochawesome.json cypress/reports/mochawesome/mochawesome-report",
  "cy:run-generate-report": "npm run delete:reports && npm run mochawesome:report || true && npm run mochawesome:merge",
  "generate-report": "marge cypress/reports/mochawesome/report.json -f report -o cypress/reports/mochawesome"
}

There are two reporting flows here: one that we add to our frontend pipeline and one that we run as a scheduled job on GitLab.

The scheduled job runs the tests against the browsers and versions most used by sellers on the Trendyol seller panel. We had originally wanted to integrate BrowserStack with Cypress here, but overcoming the auth mechanism of the Trendyol stage environment seemed impossible, so we developed an alternative.

When the scheduled job runs, we execute the tests in different browsers and generate test reports, so here we run cy:run-generate-report.

# .gitlab-ci.yml
.base:
  stage: Cross Browser Tests Report 📄
  needs: []
  <<: *only_except_skip_ci
  script:
    - npm ci --cache .npm --prefer-offline
    - npm run cy:run-generate-report
    - export SLACK_WEBHOOK_URL=https://hooks.slack.com/services/***
    - npx cypress-slack-reporter --report-dir cypress/reports/mochawesome/mochawesome-report --ci-provider=custom --custom-url=http://*trendyol.com/-/*/*/*/-/jobs/$CI_JOB_ID/artifacts/ --custom-text="$CUSTOM_TEXT" --vcs-provider=none
  artifacts:
    expire_in: 1 day
    when: always
    paths:
      - cypress/videos/**/*.mp4
      - cypress/screenshots/**/*.png
      - cypress/reports/mochawesome/**/*

Run Chrome109:
  extends: .base
  image: registry.trendyol.com/***
  <<: *only_schedule_job
  before_script:
    - export BROWSER=chrome
    - export CUSTOM_TEXT="Cross-browser testing in CHROME109"
Scheduled Job Pipeline Screen

In the job created for the pipeline, we produce the report by merging the JSON output of each of the parallel test runs. Since we want a JSON output at the end of each test suite, we add artifacts to the step where we run the tests and then combine these four different JSON files with mochawesome-merge.

Here, we run our generate-report command.
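
For intuition, the merge step boils down to something like the following Node sketch, using the programmatic APIs of mochawesome-merge and mochawesome-report-generator (marge); the glob pattern mirroring the per-job report directories is illustrative:

// merge-reports.js: an illustrative sketch of the merge step
const { merge } = require('mochawesome-merge')
const marge = require('mochawesome-report-generator')

// Collect the JSON reports produced by each parallel job...
merge({ files: ['cypress/reports/mochawesome_*/*.json'] })
  // ...and render a single combined HTML report from the merged JSON
  .then((report) => marge.create(report, { reportDir: 'cypress/reports/mochawesome' }))
  .then(() => console.log('Combined report generated'))
  .catch((err) => {
    console.error(err)
    process.exit(1)
  })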

Pipeline Screen
Slack Announcement
Test Report Example

Conclusion

In the end, by running tests in parallel, we managed to tackle the complexity of the structure mentioned in the introduction and improved the efficiency of our test projects. With every new test scenario we added and every update to existing tests, parallelization allowed us to execute more tests rapidly, and we sped up our pipeline by a minute or more.

In the context of reports, we established a structure that allowed us to merge every generated report, resulting in comprehensible reports, videos, and screen captures. This enabled us to detect UI errors and reach solutions faster and more easily.

Throughout this article, we have tried to narrate, step by step, our journey of parallelization and reporting with Cypress and the challenges we encountered along the way. We hope it proves useful to you. :)

Happy Testing!

Be a part of something great! Trendyol is currently hiring. Visit our careers pages for more information and to apply.
