How we do testing at Juvoxa — Part 1

Samkit Jain · Published in instigence · Sep 15, 2021 · 6 min read
https://xkcd.com/1700/

Something that is untested is broken.

This is part 1 of our multi-part series on how we do product testing at Juvoxa. In this part, we will be discussing the API testing process.

This is how testing used to be done at Juvoxa:

  1. Code
  2. Deploy
  3. Test
  4. Bug? Go to step 1

Sounds like something you also do? Test Driven Development (TDD) is usually sacrificed in favour of faster and more frequent deployments. This is a cost that many choose to bear and later regret once it spirals out of control. It is understandable to make that tradeoff at the start, when manpower, resources and time are limited, but keeping it in check is crucial. In this article, we’ll share how we do API testing at Juvoxa and hope that our practices will help you recover the tradeoff cost.

Our stack involves Python | Flask | Postgres. If your stack covers the same, read on.

The following section is a bit theoretical; if you want to jump straight to the fun coding part, skip ahead one section.

Manual vs Automated

Wherever possible, automate your testing process. With manual testing, you rely on the developer or the tester to exercise the functionality properly, which is time-consuming and error-prone, and you still need to ensure that all the edge cases are covered. Automated testing with TDD saves developer time because the process runs without intervention.

  • No unintended consequences as bugs are caught early on.
  • A faster transition from the development phase to the QA phase.
  • Reproducible user behaviour with integration testing.
  • An extra hour writing the tests today saves us an extra day tomorrow.
  • Overall more reliable and stable API system.
  • No more (okay, fewer) XKCD #1739 events.
  • Boosts developer confidence.

Minimal Requirements

The testing setup should have the following, at the very least:

  • Cover both the positive (API endpoint returning an HTTP 2XX response) and negative (API endpoint returning a non-2XX HTTP response) workflows.
  • Validate not just the HTTP status code, but also the response payload.
  • For a Pull Request (PR) to be deemed ready for review, it must include a test that passes with the code changes made in the PR but fails without them. The code coverage should improve, or at the very least remain the same; it should never decrease.
  • In the CI/CD pipeline, all the tests must pass before the changes can be deployed.

API Testing Strategies

Unit Testing
This involves testing a single API endpoint. The tests for that endpoint run various simulations, covering both positive and negative workflows, and validate that the endpoint behaves as expected. For example, testing the login API endpoint with the correct credentials.
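As an illustration, a minimal pytest sketch of such unit tests might look like the following. The /signin route, the JSON payloads and the status codes are assumptions for illustration, and the client fixture is sketched in the conftest.py example further below.

# tests/api/test_01_signin_api.py — hypothetical unit tests for a signin endpoint.
# Assumes a `client` fixture (Flask test client) is defined in conftest.py.

def test_signin_00_correct_credentials(client):
    # Positive workflow: valid credentials should return HTTP 200.
    response = client.post(
        "/signin", json={"email": "doctor@example.com", "password": "correct-password"}
    )
    assert response.status_code == 200
    # Validate the response payload, not just the status code.
    assert "token" in response.get_json()

def test_signin_01_wrong_password(client):
    # Negative workflow: invalid credentials should return HTTP 401.
    response = client.post(
        "/signin", json={"email": "doctor@example.com", "password": "wrong-password"}
    )
    assert response.status_code == 401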

Integration Testing
This involves running multiple API endpoints in a single test. This is useful for simulating user scenarios and testing API endpoints that depend on each other. For example, simulating the flow in which a doctor creates a program on the platform, which involves multiple endpoints to create a base program, add contents, add tags, etc.

Coding Time

* Assuming you already have a Flask app created and use Docker.

Before the tests can be run, you need to create an ephemeral database that lives only for as long as your tests run. You don’t want to be running the tests against the production database.

To start a sample PostgreSQL database, you can use the official Docker image like so (if you are using a different database backend, you can use a different Docker image):

$ docker run -d --rm -p 127.0.0.1:5432:5432 -e POSTGRES_DB=db_name -e POSTGRES_USER=username -e POSTGRES_PASSWORD=password --name postgres-container postgres:13.1

The above will start a PostgreSQL 13.1 server, keep it running in the background, and create an empty database by the name db_name that is accessible by the Postgres user username with the password password. The database can be accessed over localhost at port 5432. The equivalent connection string is postgresql://username:password@localhost:5432/db_name.
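Since the container was started with --rm, it cleans up after itself; once the tests finish, a single command stops and removes it:

$ docker stop postgres-container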

Set up your testing directory as

tests/
|-- __init__.py
|-- api
| |-- test_00_signup_api.py
| `-- test_01_signin_api.py
`-- conftest.py

Tests are run in alphabetical order, so if you want your tests to run in a certain sequence, prefix the file names with 00, 01, 02, …
conftest.py is the configuration file where you can register hooks and shared fixtures.

It is better to organise your test files in the api/ folder and have a separate file for each API.

Notice that even within the individual test files, the test method names carry 00, 01, … markers to ensure that the tests always run in the required order. You can leverage the testing order to populate the same database as you go, without relying on a prepopulated database (though that is a choice as well).
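As a starting point, here is a minimal conftest.py sketch. It assumes your project exposes an application factory named create_app that accepts a database URI; both names are illustrative rather than part of any real codebase.

# tests/conftest.py — minimal sketch of shared pytest fixtures.
import pytest

from app import create_app  # hypothetical application factory

@pytest.fixture(scope="session")
def app():
    # Point the app at the ephemeral database started earlier.
    app = create_app("postgresql://username:password@localhost:5432/db_name")
    app.config["TESTING"] = True
    yield app

@pytest.fixture(scope="session")
def client(app):
    # Flask's built-in test client, used by the test functions.
    return app.test_client()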

To run the tests, first install pytest

$ python -m pip install pytest  # Required only once
$ python -m pytest tests/

If everything is ok, you should see a green tick saying all tests have passed.

If you want to run only a subset of tests, you can use the -k option provided by pytest. More details can be found in the pytest documentation (https://docs.pytest.org/).
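For example, to run only the signup tests from the layout above:

$ python -m pytest -k "signup" tests/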

The above was a very simple example to help you get started with integrating testing into your API development process. You can extend the test cases above to suit your needs. The example demonstrated unit testing the APIs; to create scenarios and perform integration testing, you can update the tests to call multiple APIs in sequence and validate each step (for example, simulating a flow where a user registers but does not verify their email and then tries to log in and perform restricted actions).
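A sketch of such an integration test, again with hypothetical routes, payloads and status codes, might look like:

# tests/api/test_02_unverified_user_flow.py — hypothetical integration test
# chaining multiple endpoints to simulate a single user scenario.

def test_unverified_user_cannot_sign_in(client):
    # Step 1: register a new user (positive workflow).
    response = client.post(
        "/signup", json={"email": "new@example.com", "password": "secret"}
    )
    assert response.status_code == 201

    # Step 2: without verifying the email, signing in should be rejected
    # (the exact status code and payload depend on your API's design).
    response = client.post(
        "/signin", json={"email": "new@example.com", "password": "secret"}
    )
    assert response.status_code == 403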

CI/CD

We use AWS CodeBuild for our CI/CD flow. A sample buildspec.yml file to run the tests as part of your CI pipeline will look like the following.
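This is a minimal sketch, assuming your dependencies live in requirements.txt and the build environment can run Docker (privileged mode) for the ephemeral database:

version: 0.2

phases:
  install:
    runtime-versions:
      python: 3.8
    commands:
      - python -m pip install -r requirements.txt
      - python -m pip install pytest
  pre_build:
    commands:
      # Start the ephemeral test database.
      - docker run -d --rm -p 127.0.0.1:5432:5432 -e POSTGRES_DB=db_name -e POSTGRES_USER=username -e POSTGRES_PASSWORD=password --name postgres-container postgres:13.1
  build:
    commands:
      - python -m pytest tests/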

Code Coverage

Code coverage is a means to measure how much of your code is executed when running the tests. The higher the coverage, the better. There are multiple coverage report generators that work alongside pytest, the most popular being pytest-cov and Coverage.py.

We use pytest-cov. To use it, you just need to install it via pip and pass the required parameters when running the tests.

$ python -m pip install pytest-cov
$ python -m pytest --cov=./ --cov-report=xml tests/

To visualise the coverage in a better way and include it in your CI setup, you can use a service like Codecov. Create an account and authenticate with GitHub. Once done, you just need to upload the coverage.xml report generated above to Codecov and let it handle the rest. For private repos, you can do so using

$ bash codecov_upload.sh

where codecov_upload.sh has the following content

#!/bin/bash
bash <(curl -s https://codecov.io/bash) -t YOUR-CODECOV-TOKEN

The buildspec.yml with the Codecov upload step will now look like the following.
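Again a minimal sketch, extending the earlier file; the YOUR-CODECOV-TOKEN inside codecov_upload.sh is assumed to be configured for your repository:

version: 0.2

phases:
  install:
    runtime-versions:
      python: 3.8
    commands:
      - python -m pip install -r requirements.txt
      - python -m pip install pytest pytest-cov
  pre_build:
    commands:
      # Start the ephemeral test database.
      - docker run -d --rm -p 127.0.0.1:5432:5432 -e POSTGRES_DB=db_name -e POSTGRES_USER=username -e POSTGRES_PASSWORD=password --name postgres-container postgres:13.1
  build:
    commands:
      - python -m pytest --cov=./ --cov-report=xml tests/
  post_build:
    commands:
      # Upload the coverage report to Codecov.
      - bash codecov_upload.sh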

Tests should not be considered a burden but rather a means to empower the team to build better systems. They give you confidence when adding new features and allow your development to scale as the codebase grows.

In the upcoming articles, we will be discussing how we test our frontend. Follow us so you don’t miss any new articles.

Think you can help us improve the testing process? Come join us! Juvoxa is hiring. Send an email to hr@juvoxa.com.
