Adoption of Contract Testing

Dmitry Ivanov
Picsart Engineering
9 min read · Apr 25, 2024

In today’s software development world, where services and apps need to talk to each other smoothly, things get complex fast. The adoption of microservices has changed the way we build and deploy backend applications. At the same time, the variety of client types, from mobile devices (iOS, Android) to web browsers, adds another layer of complexity to cross-service communication. At Picsart we’re all about crafting apps that look and work great, whether on your phone, tablet, or computer. But with so many different technologies and app versions floating around, moving forward without breaking anything is a challenge.

Years ago, when both API consumer and producer code was bundled into a single application, ensuring compatibility was straightforward — API specifications could be as simple as method signatures, with compile-time checks providing immediate feedback to developers. However, when the boundary between consumer and producer is lifted to another level, such as a REST API interface, this fast feedback is lost. This can lead to confusion, errors, and communication gaps between teams working on different parts of a complex service-oriented application.

API contracts, or API specifications, are the way to escape this uncertainty in communication between two sides (client and server, producer and consumer) and the key to removing cross-team dependencies and scaling feature development. A single API specification is the foundation for clear and unambiguous communication between different parts of our system. However, an API specification by itself is just a document, so we need a way to enforce checks against it, just as the compiler did in the pre-service era of applications. This is where Contract Testing steps in. Contract Testing is a technique used to verify that the parts of a service-oriented architecture interact with each other (and with clients) correctly. It focuses on confirming that the API specification is met and respected by both parties.

By adopting contract testing, we’ve found a solution that works not only for supporting various client applications, but also for our microservice backend, ensuring our development process remains agile and our applications stable. So, let’s take a closer look at the steps we took to introduce contract testing into our daily work.

Choosing the Right Tools

The journey to integrating contract testing within a team’s workflow begins with selecting the right tools. With our teams coming from a variety of tech backgrounds, it was crucial to pick tools that felt familiar and could easily fit into our current ways of working.

The criteria for this selection were clear: ease of integration, compatibility with existing workflows, and a minimal learning curve for both producer and consumer teams.

All of our team members were already familiar with the OpenAPI specification, as we were using it to describe our REST APIs, so we started our search by investigating whether we could build on what we already had.

This led us to discover Specmatic, a tool that can use the OpenAPI specification itself as an executable API contract. This meant we could use our existing API specifications to automatically generate and run tests with almost zero effort to get started. While other popular contract testing tools, such as Pact or Spring Cloud Contract, require additional learning effort from the team, an OpenAPI specification is usually something you already have; and if you don’t, there are plenty of tools that will auto-generate one for your APIs, which you can then use as a base without writing everything from scratch.

Given these advantages, we decided to give this combination of tools a try to make sure they met our needs.

Let’s take a look at an example of the API spec:

/orders:
  get:
    tags:
      - orders
    summary: Get list of orders
    operationId: findOrderList
    parameters:
      - name: limit
        in: query
        required: false
        schema:
          maximum: 30
          minimum: 1
          type: integer
          format: int32
        examples:
          GET_ORDERS_200:
            value: 30
          GET_ORDERS_400_MIN_LIMIT:
            value: 0
          GET_ORDERS_400_MAX_LIMIT:
            value: 31
    responses:
      '200':
        description: Successful response
        content:
          application/json:
            schema:
              $ref: '#/components/schemas/OrderListResponse'
            examples:
              GET_ORDERS_200:
                value:
                  status: success
                  response:
      '400':
        description: Bad Request
        content:
          'application/json':
            schema:
              $ref: '#/components/schemas/FailedResponse'
            examples:
              GET_ORDERS_400_MIN_LIMIT:
                value:
                  reason: "validation_failed"
                  message: "limit must be greater than or equal to 1"
                  status: error
              GET_ORDERS_400_MAX_LIMIT:
                value:
                  reason: "validation_failed"
                  message: "limit must be less than or equal to 30"
                  status: error

The above is a standard OpenAPI specification, enhanced with “examples” (see Adding Examples to OpenAPI, Example OpenAPI Object).

In this specification we see one REST API (GET /orders) with one parameter “limit”. In our experiments with Specmatic, we discovered that its ability to auto-generate test values, while useful, doesn’t always fit our needs. For example, when an API requires a specific kind of data (like an ID from our database), random strings just won’t cut it. That’s why we decided to define our test examples manually. This approach ensures our tests are both realistic and relevant.

In this specification we have 3 examples for the “limit” parameter, and those will be converted into 3 tests:
- GET_ORDERS_200 — this request expects a successful response
- GET_ORDERS_400_MIN_LIMIT, GET_ORDERS_400_MAX_LIMIT — these requests expect a 400 Bad Request response

This way, by enhancing the specification with examples, we not only created useful documentation for our API, but also created 3 tests! Of course, writing test examples for an API with several parameters can get complex: it requires thinking through all the possible combinations of inputs. However, facing this complexity head-on can also be helpful. If we find that our API requires too many parameters, it might be a sign that our API design is more complicated than it needs to be, and we might need to revisit it.
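To make the mapping between examples and tests concrete, here is a small sketch (in plain Java, purely for illustration; this is not Picsart’s actual server code) of the validation rule the three examples exercise: one value inside the schema’s bounds, and one value just outside each bound.

```java
// Hypothetical sketch of the server-side check behind the three examples.
public class LimitValidation {
    static final int MIN_LIMIT = 1;  // mirrors "minimum: 1" in the schema
    static final int MAX_LIMIT = 30; // mirrors "maximum: 30" in the schema

    // Returns the HTTP status the contract expects for a given "limit" value.
    public static int expectedStatus(int limit) {
        return (limit < MIN_LIMIT || limit > MAX_LIMIT) ? 400 : 200;
    }
}
```

Each named example simply picks a value on one side of these bounds (30, 0, 31), which is how a single schema constraint turns into one happy-path test and two boundary tests.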

Implementation strategy

Besides the technical side, like OpenAPI specifications and Specmatic, the next important step is facilitating communication between teams. It’s essential for both API producer and consumer teams to come together, share insights, and make decisions collectively. We need to make sure that every stakeholder of a future API implementation has a voice.

As specifications are plain text files (YAML in the case of OpenAPI), we decided to store them in a git repository and use the Merge Request review process to support collaboration: any update to our API documentation is stored in the repository, and before changes are finalized, they must be approved by the teams that are directly affected. This review process guarantees that everyone is on the same page, reducing the chance of surprises down the line. We also noticed that this process can surface different views of the same API from the consumer and producer sides, and specification review is a good time and place for such discussions.
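A spec repository like this can also enforce basic quality automatically. As a sketch, here is a hypothetical GitLab CI job (the job name, image, and spec path are assumptions; substitute the linter your organization uses) that lints OpenAPI files on every merge request:

```yaml
# Hypothetical CI job: lint OpenAPI files on every merge request.
# The Spectral image and the specs/ path are assumptions; adapt to your setup.
lint-api-specs:
  image: stoplight/spectral:latest
  script:
    - spectral lint "specs/**/*.yaml"
  rules:
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
```

Running a linter before human review keeps the Merge Request discussion focused on API design rather than on formatting mistakes.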

Our setup recognizes two main types of contracts: public and internal. Each type of contract covers its own communication path, and for each type we defined a different set of teams that should be involved in the API spec change review process.

Having manually crafted specifications collected in a single central repository not only allows for contract testing but also enables other useful flows that might be important in your organization, such as security or legal reviews of public APIs. For instance, our developers find regular merge requests easier to understand: since code merge requests are reviewed alongside API specification changes, it is clearly visible which parts of the application were changed and where the entry points for review are, so reviewers need less time to understand the impact of changes!

The test pyramid

Another important thing to decide is the place of contract tests in the test pyramid.

If we look at the classic pyramid, contract tests belong in the “Service Tests” layer. But when weighing the number of contract tests against integration tests, it can be hard to find the right balance from the beginning, because in reality, “it depends”.

The key is to use both contract and integration tests in a way that complements your testing strategy. Contract tests are great for getting fast feedback, which is invaluable for identifying issues early in the development process. On the other hand, integration tests are essential for validating complex workflows that contract tests might not fully cover.

Therefore, the aim is to develop a testing approach that is both efficient and comprehensive. By strategically combining contract tests for quick checks with integration tests for in-depth verification, you can ensure your project benefits from speedy development cycles without compromising on quality or coverage.

Technical challenges

Specmatic provides an easy-to-use JUnit TestFactory for JVM applications, which can be used to generate and run contract tests against your service.

class ContractTest : SpecmaticJUnitSupport() {
    companion object {
        @JvmStatic
        @BeforeAll
        fun setUpAll() {
            System.setProperty("host", "localhost")
            System.setProperty("port", "8080")
            System.setProperty("testBaseURL", "http://localhost:8080/api")
            System.setProperty("timeout", "600")
        }
    }
}

While JUnit is a testing framework from the Java (JVM) world, it can still be used to run tests against a service written in any language: Specmatic merely generates the tests and uses the provided host and port to connect to the target server.

At the same time, the test-factory approach does not allow you to use @BeforeEach and @AfterEach methods to prepare test data or reset state after a test, which is what we usually do with JUnit. To achieve the same behavior, we implement the `org.junit.platform.launcher.TestExecutionListener` interface, which provides methods like `executionStarted` and `executionFinished`. These methods accept a single parameter of type `TestIdentifier` that contains details about each test case, including the name of the test scenario generated by Specmatic.

import org.junit.platform.launcher.TestExecutionListener
import org.junit.platform.launcher.TestIdentifier

private const val SCENARIO_PREFIX = " Scenario: " // prefix used by Specmatic

class MockServerExecutionListener : TestExecutionListener {

    override fun executionStarted(testIdentifier: TestIdentifier) {
        val testName = testIdentifier.displayName
        if (!testName.startsWith(SCENARIO_PREFIX)) {
            return
        }
        arrangeData()
        arrangeOrders()
        ...
    }
}

For organizations managing multiple services across various teams, the initial absence of OpenAPI specifications can be a barrier, because it is hard to synchronize OpenAPI adoption at scale. To get around this, we used MockServer with handcrafted request and response expectations. Eventually, those handcrafted MockServer expectations become candidates for replacement with responses generated from the OpenAPI specification, and there are a few ways to achieve that.

To achieve the needed API behavior, a MockServer OpenAPI expectation can be configured in a flexible way by providing a mapping between the operations defined in the specification and response codes:

import org.junit.platform.launcher.TestExecutionListener
import org.junit.platform.launcher.TestIdentifier
import org.mockserver.client.MockServerClient
import org.mockserver.mock.OpenAPIExpectation

class MockServerExecutionListener : TestExecutionListener {

    override fun executionStarted(testIdentifier: TestIdentifier) {
        val testName = testIdentifier.displayName
        if (!testName.startsWith(SCENARIO_PREFIX)) {
            return
        }

        val mockServerClient = MockServerClient(...)
        mockServerClient.reset()
        mockServerClient.beforeEach(testName)
    }

    private fun MockServerClient.beforeEach(testName: String) {
        val responseCode: String = responseForTest(testName)
        upsert(
            OpenAPIExpectation.openAPIExpectation(
                "/internal-service.yaml",
                mapOf(
                    "getObject" to responseCode,
                    "createObject" to responseCode,
                    "deleteObject" to "200",
                )
            )
        )
    }
}
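The snippet above calls a `responseForTest(...)` helper whose implementation is left out. A minimal version might derive the downstream response code from Specmatic’s generated scenario name; the sketch below (in plain Java, and with the name-to-code rule being an assumption based on the example naming convention used earlier, e.g. GET_ORDERS_400_MIN_LIMIT) shows one way it could look:

```java
// Hypothetical mapping from a Specmatic scenario name to the response code
// the downstream MockServer stub should return for that test.
public class ScenarioResponses {
    public static String responseForTest(String testName) {
        // Our example ids embed the expected status, e.g. GET_ORDERS_400_MIN_LIMIT,
        // so a 400-flavored scenario gets a failing downstream stub as well.
        if (testName.contains("_400")) {
            return "400";
        }
        return "200"; // default: happy-path stubs for everything else
    }
}
```

Whatever convention you choose, the point is that the scenario name is the only piece of context the listener receives, so any per-test stubbing decision has to be derivable from it.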

Lessons Learned

After a year of embracing the contract-driven approach, we’ve witnessed firsthand how it can enhance teamwork and lead to better technical designs. However, embracing this approach hasn’t been without its challenges. The most significant one has been shifting the team’s mindset to prioritize creating and maintaining API specifications from the very beginning of feature development. Finding and committing to tools that fit our team dynamics was crucial to making this shift possible. But besides the tools, leadership plays a pivotal role in this transition: team leads and architects must actively participate in writing specifications and in API design discussions.

In conclusion, moving towards contract testing affects not only testing and quality assurance, but the entire end-to-end development process, making it more collaborative, asynchronous, and performant, and less error-prone at the same time.
