Coding for rapid releases, don’t forget the basics!
Nordnet Tech teams use a bunch of great tooling and open source projects such as Terraform, ArgoCD, Kubernetes and Istio, following GitOps practices. Despite all of these techniques, everything still boils down to the craft of writing great code.
As a Software Engineer at Nordnet, my mission is to build robust applications that can be deployed and released at any time.
The challenge then is: How do I make sure that my new endpoint, bug fix or rewrite didn’t come with side effects such as:
- Introducing another bug.
- Making my application crash.
- Breaking the API contract for the consumers of my application.
Therefore, at the heart of every application, we should always include a suite of Integration Tests (not to be confused with integrated tests).
Integration Test
- Makes sure that all the components in the application work together as expected.
- Starts the real application together with a running datastore.
- Runs by sending a request to an endpoint and verifying the response from the application.
Tests should
- Not have direct dependencies on the source code (changes to the source code do not mean I have to change my test code).
- Not have external dependencies on other services (runs on my machine as well as in the CI/CD pipeline; some tooling may be required, but no other apps should need to be running for the test to work).
- Run fast for quick feedback.
- Verify the API contract used by client applications.
Today at Nordnet, our focus is to write good Integration Tests rather than Unit Tests for every class. This helps us to achieve the goal of not having direct dependencies to the source code. There are of course exceptions to this rule, but that is not the focus of this article. Read Testing of Microservices for more information.
In the Discover team, we use Redis heavily as a cache layer. We also split read and write operations between different applications, which allows us to optimize read and write workloads and scale the apps independently.
Example app
One of our services at Nordnet provides company data such as links to homepages, names of CEOs and much more. This example consists of an application for serving the company data, stripped down to focus on the important parts.
The Distributed Application consists of
- Company application which can be scaled to multiple instances based on the current load.
- Loader application (not part of example) which takes care of third party integrations and loads data into Redis.
- Redis cache for fast data consumption, which can also be scaled with more read replicas.
Company Application
First we create a simple GET endpoint to fetch a Company by its ID.
/**
 * Endpoint for fetching company data
 */
@RestController
@RequestMapping(produces = MediaType.APPLICATION_JSON_VALUE)
@RequiredArgsConstructor
public class CompanyController {

    private final CompanyRepository repository;

    @GetMapping("/companies/{id}")
    public Mono<ResponseEntity<Company>> get(@PathVariable CompanyId id) {
        return repository.read(id)
                .map(ResponseEntity::ok)
                .switchIfEmpty(Mono.defer(() -> Mono.just(ResponseEntity.noContent().build())));
    }
}
We use a Spring RestController for creating our endpoint, accessible with a GET request on the path /companies/{id}.
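The Company and CompanyId types referenced by the controller are not shown in the article. A minimal sketch of what they could look like (hypothetical; the field names are taken from the JSON payload shown later in the article, and the real classes may differ):

```java
// Hypothetical value types; the real classes are not shown in the article.
class CompanyId {

    private final String value;

    CompanyId(String value) {
        this.value = value;
    }

    String getValue() {
        return value;
    }
}

// Fields mirror the JSON document stored in Redis for a company.
class Company {
    public String ceo;
    public String email;
    public String headquarter;
    public String introductionDate;
    public String isinCode;
    public String name;
    public String url;
}
```

Wrapping the raw string in a CompanyId type keeps the controller signature expressive and lets Spring convert the path variable for us.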
Next we hide the Redis integration in a repository layer, where we use Redis strings with a GET command to fetch our data. In spring-data-redis that is called ReactiveValueOperations.
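The article does not show how the ReactiveValueOperations bean is wired. One possible configuration, assuming the reactive Redis starter is on the classpath (Spring Boot then auto-configures a ReactiveStringRedisTemplate):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.core.ReactiveStringRedisTemplate;
import org.springframework.data.redis.core.ReactiveValueOperations;

// Hypothetical wiring; the article's actual configuration is not shown.
@Configuration
class RedisConfig {

    // Expose string-to-string value operations so repositories can
    // inject ReactiveValueOperations<String, String> directly.
    @Bean
    ReactiveValueOperations<String, String> stringValues(ReactiveStringRedisTemplate template) {
        return template.opsForValue();
    }
}
```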
/**
 * Repository for fetching company data in Redis
 */
@Slf4j
@Repository
@RequiredArgsConstructor
class CompanyRepository {

    private final ReactiveValueOperations<String, String> stringValues;
    private final ObjectMapper mapper;

    Mono<Company> read(CompanyId id) {
        return stringValues.get(id.getValue())
                .mapNotNull(v -> {
                    try {
                        return mapper.readValue(v, Company.class);
                    } catch (JsonProcessingException e) {
                        log.error("Couldn't map json value", e);
                    }
                    return null;
                });
    }
}
The company-app has no state itself but is dependent on the contract of the data format stored in Redis by our loader application. The loader application has its own integration test suite which also creates a dump of the data stored in Redis. Here is an example: How to access the data dump running Redis in a docker container.
Hey, that is exactly what we need in our company-app for the integration tests! So, let’s start writing tests!
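Before diving into the tests, the contract itself is worth spelling out: plain string keys (the company ID) mapped to JSON string values. A toy illustration, using a Map as a stand-in for Redis and a trimmed-down payload:

```java
import java.util.HashMap;
import java.util.Map;

class RedisContractSketch {

    public static void main(String[] args) {
        // Stand-in for Redis: the loader stores a JSON string under the
        // company id; the company-app reads it back by the same key.
        Map<String, String> store = new HashMap<>();
        store.put("nordnet", "{\"name\":\"Nordnet\",\"ceo\":\"Lars-Ake Norling\"}");

        // The reading side depends only on the key and the JSON shape,
        // never on the loader's source code.
        String json = store.get("nordnet");
        System.out.println(json);
    }
}
```

As long as both applications agree on this key/value shape, either side can be rewritten freely without breaking the other.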
Test Implementation
- When running the tests, make sure Docker is already started.
- We are using Testcontainers and SpringBootTest for running our test.
@Testcontainers
@SpringBootTest(classes = {Application.class}, webEnvironment = RANDOM_PORT)
abstract class IntegrationTestSetup {

    private static final String REDIS_IMG = "redis:6.2.7-alpine";
    // This file is generated by our Loader application
    private static final MountableFile REDIS_DUMP_PATH = MountableFile.forClasspathResource("redis/dump.rdb");
    private static final String REDIS_CONTAINER_DUMP_PATH = "/data/dump.rdb";
    private static final int REDIS_PORT = 6379;

    @Container
    static GenericContainer<?> redis = new GenericContainer<>(DockerImageName.parse(REDIS_IMG))
            .withExposedPorts(REDIS_PORT)
            .withCopyFileToContainer(REDIS_DUMP_PATH, REDIS_CONTAINER_DUMP_PATH);

    @DynamicPropertySource
    static void setRedisProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.redis.host", () -> redis.getHost());
        registry.add("spring.redis.port", () -> redis.getFirstMappedPort());
    }
}
- The @Container annotation starts our container from a Redis image.
- @DynamicPropertySource is used to solve the problem with dynamic properties. It's an easy way to set the correct host and port properties that change between runs.
- We also preload our Redis instance with a data dump created by the loader application (not part of the example).
The actual test is then based on the data already stored in Redis, running a GET request for the Nordnet company, stored with the ID "nordnet".
@DisplayName("Test company-app")
class CompanyDataIntegrationTests extends IntegrationTestSetup {

    @Autowired
    private WebTestClient testClient;

    @Test
    @DisplayName("Get Nordnet company data on id")
    void getNordnetCompanyData() throws Exception {
        var result = testClient.get().uri("companies/nordnet")
                .exchange()
                .expectStatus().isOk()
                .expectBody(String.class)
                .returnResult()
                .getResponseBody();

        assertEquals(Path.of("nordnet.json"), result);
    }

    private void assertEquals(Path expectedJsonData, String actual) throws Exception {
        final File file = Path.of("src/test/resources/expected").resolve(expectedJsonData).toFile();
        final String expected = FileUtils.readFileToString(file, StandardCharsets.UTF_8);
        JSONAssert.assertEquals(expected, actual, true);
    }
}
We fetch the result and then compare it to the nordnet.json file:
{
"ceo" : "Lars-Ake Norling",
"email" : "info@nordnet.se",
"headquarter" : "Stockholm",
"introductionDate" : "2020-11-25",
"isinCode" : "SE0015192067",
"name" : "Nordnet",
"url" : "https://nordnetab.com/sv/"
}
- It looks easy and the test code is completely isolated from the source code.
- There is actually a lot going on! We make sure our application starts and connects to a real Redis instance, and we verify the contract of our company endpoint.
- If a change to the source code breaks this contract, our test fails, and we need to fix the code before we can commit.
- A catch like that, in this early phase of development, saves us tons of time compared to building and deploying the application and then running tests in some test/staging environment before going to production.
Conclusion
Let us head back to our checklist again and see if our test fulfills the requirements:
Our test
- Has no direct dependencies to the source code.
- Runs fast for quick feedback.
- Has no external dependencies to other services.
- Verifies the API contract used by client applications with our endpoint test.
This is of course a simple example, but it could easily evolve together with your code as more advanced features and new endpoints are built.
Last but not least, we fulfilled the checklist above and can now proudly run our tests with confidence and hand our code down the manufacturing-gitops-argo-terra-you-name-it-line for release to our customers :)