Microservices sources auto-update

Migration from a monolith to microservices

We use the microservice architecture in the backend of our project at Space307. Each microservice is responsible for some part of the platform logic. However, it wasn’t always that way. Five years ago, the backend consisted of a single monolith written in PHP.

Over time, the company’s structure changed, and we switched to development in cross-functional teams. From that point on, working with the monolith became more difficult because:

  • Different teams made many changes every day and opened many Merge Requests, which led to conflicts in the code.
  • Every deployment restarted the entire project.
  • An error in new code could bring down the entire product.

After the split into cross-functional teams, we had to change our approach to backend development, and microservice architecture was the best fit. With microservices, each team is responsible for its own services and overlaps minimally with the areas of responsibility of other teams. When properly designed, services are loosely coupled, which increases the reliability of the entire project.

Microservice architecture is not a silver bullet and comes with many disadvantages; a search for “microservices cons” turns up thousands of articles. Since this topic is vast, I want to focus on one specific problem of microservice architecture and describe how we are trying to solve it at Space307: how to maintain and update dependencies in the source code of hundreds of microservices.

How to Update Hundreds of Microservices

First, I want to give more context about our backend. Today, we have about 200 microservices in the project. All of them are written in Go and stored in separate git repositories. Each microservice uses our in-house framework and in-house libraries for integration with our infrastructure. Both the framework and the libraries are constantly updated, and we want these dependencies to be updated regularly in all microservices. There is also a standard template for each microservice; it changes regularly, and we want to propagate those changes to all existing repositories. Finally, there is the most important question: how do we do it?
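
To make the dependency picture concrete, here is what a typical service’s go.mod might look like. The module paths are hypothetical and only illustrate the split between the in-house framework, in-house libraries, and third-party packages:

```go
// go.mod of a typical service; all module paths are made up for illustration.
module git.example.com/backend/payments-service

go 1.21

require (
	// In-house framework and infrastructure libraries that we want
	// to keep up to date across all ~200 services.
	git.example.com/platform/framework v1.42.0
	git.example.com/platform/kafka-client v0.9.3

	// Regular third-party dependencies.
	github.com/stretchr/testify v1.9.0
)
```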

The first solution that comes to mind is to update dependencies and make changes to the template manually. The following algorithm is used for each service:

1) Create a branch off the main branch.
2) Update the relevant dependencies and apply changes to the project template.
3) Push the changes and create an MR (Merge Request).
4) Get approval from teammates and wait for CI to complete successfully.
5) Merge into the main branch and deploy.

For each service, this algorithm takes about 30 minutes. With 200 services, updating all of them takes 200 * 30 = 6000 minutes, or 100 hours, which is very expensive. Since it is more valuable for the business to spend engineering time on the product rather than on routine work, we decided to automate this.

Developing a Tool for Auto-updating Sources

We need a utility that updates the microservices’ sources and creates MRs. Ready-made solutions do not suit us because, in addition to dependencies, we need to update service templates and perform other actions specific to our project.

We didn’t want to spend a lot of time developing this utility and decided that an MVP with the following functionality would be enough to start with (a sketch of the CLI wiring follows the list):

  • Parsing repositories of all services.
  • Updating dependencies, templates, and documentation for each service.
  • Creating a Merge Request in every repository.
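
The `c.ctx.Path("output")` call in the listing further below suggests a flag-parsing framework such as urfave/cli. Here is a minimal sketch of the CLI wiring under that assumption; the command structure and flag names are illustrative, not the actual implementation:

```go
package main

// A minimal sketch of the CLI wiring, assuming urfave/cli/v2.
// The structure and flag names are illustrative, not the real ones.
import (
	"log"
	"os"

	"github.com/urfave/cli/v2"
)

type cmdRepoDepsUpd struct {
	ctx *cli.Context
}

// run executes the update algorithm; see the listing below.
func (c *cmdRepoDepsUpd) run() error { return nil }

func main() {
	app := &cli.App{
		Name: "bbtools",
		Flags: []cli.Flag{
			&cli.StringFlag{Name: "u", Usage: "git user"},
			&cli.StringFlag{Name: "p", Usage: "git token"},
		},
		Commands: []*cli.Command{{
			Name: "repos",
			Subcommands: []*cli.Command{{
				Name: "depsupd",
				Flags: []cli.Flag{
					&cli.PathFlag{Name: "output", Aliases: []string{"o"}},
					&cli.StringFlag{Name: "title"},
					&cli.StringFlag{Name: "branch"},
				},
				Action: func(ctx *cli.Context) error {
					return (&cmdRepoDepsUpd{ctx: ctx}).run()
				},
			}},
		}},
	}
	if err := app.Run(os.Args); err != nil {
		log.Fatal(err)
	}
}
```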

We do not automate the merge and deployment because we have a formal process: when deploying changes, an engineer must monitor the state of the service.

However, this functionality covers most of the routine work, because merging the MR and deploying take around 3 minutes per service. For 200 services, that is 3 * 200 = 600 minutes, or 10 hours. Based on this set of requirements, a single engineer can develop the utility in approximately 40 hours. The first update therefore takes 40 + 10 = 50 hours, and each subsequent update takes 10 hours. With the manual approach, each update takes 100 hours: twice as long as the first automated iteration and ten times longer than each subsequent one.

We calculated that the tool would pay for itself on its first use and compiled the set of requirements. After that, we could start the actual development.

Next, I will walk through the utility’s code, written in Go, with comments. The utility is a CLI application. When we run the tool, it executes the following algorithm:

```go
func (c *cmdRepoDepsUpd) run() error {
	outputFile := c.ctx.Path("output")

	var err error

	// We write to memory the already processed repositories for idempotency,
	// in case a repository is already present in the result file.
	if err = c.fillProceedRepos(outputFile); err != nil {...}

	// We get the most updated version of the Makefile.
	if err = c.receiveLatestMakefile(); err != nil {...}

	// We get the most updated version of Go.
	if err = c.receiveLatestGoVersion(); err != nil {...}

	// We get the most updated version of the framework.
	if err = c.receiveLatestFrameworkGoModVersion(); err != nil {...}

	closeCSV, err := c.createOutputCsvWriter(outputFile)
	if err != nil {...}
	defer closeCSV()

	// We get the list of microservices' repositories.
	repos, err := c.getRepos()
	if err != nil {...}

	// We iterate through each repository and execute the update process.
	for _, r := range repos {
		if err = c.processRepo(r); err != nil {...}
	}

	return nil
}
```
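
As an illustration of the idempotency step, here is a minimal sketch of what a helper like fillProceedRepos could look like, assuming the result file is a CSV whose first column holds the repository slug; the real implementation may differ:

```go
// A sketch of the idempotency helper, assuming the first CSV column is the
// repository slug. Imports: encoding/csv, errors, os.
func (c *cmdRepoDepsUpd) fillProceedRepos(outputFile string) error {
	c.proceedRepos = make(map[string]struct{})

	f, err := os.Open(outputFile)
	if err != nil {
		// No result file yet means no repositories have been processed.
		if errors.Is(err, os.ErrNotExist) {
			return nil
		}
		return err
	}
	defer f.Close()

	rows, err := csv.NewReader(f).ReadAll()
	if err != nil {
		return err
	}
	for _, row := range rows {
		if len(row) > 0 {
			c.proceedRepos[row[0]] = struct{}{}
		}
	}
	return nil
}
```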

After that, the update process runs for each repository. The algorithm looks like this:

```go
func (c *cmdRepoDepsUpd) processRepo(repo gitv1.Repository) error {
	var err error

	// We skip the already processed repository.
	if _, ok := c.proceedRepos[repo.Slug]; ok {...}

	// If the repository already has an open PR for an update, then we skip it.
	if c.findAlreadyOpenedPR(repo) {...}

	...

	// We update the Go version in the service.
	err = c.updateGoVersionFile(repoPath)
	if err != nil {...}

	// We replace the public docker images with our local versions
	// (for instance, RabbitMQ, Kafka, etc.).
	err = c.fixDockerCompose(repoPath)
	if err != nil {...}

	// We update the service's Makefile.
	err = c.updateMakefile(repoPath, repo.Slug)
	if err != nil {...}

	// We update the Swagger documentation generator's version.
	err = c.updateAppDoxelerator(repoPath, repo.Slug)
	if err != nil {...}

	// We regenerate the Swagger documentation.
	err = c.callMakeCodedocs(repoPath)
	if err != nil {...}

	// We update all Go dependencies.
	err = c.updateAllGoMods(repoPath, repo.Slug)
	if err != nil {...}

	// We run a linter in auto-fix mode.
	err = c.callMakeLintFix(repoPath)
	if err != nil {...}

	// We run the service CI process locally to ensure everything works fine
	// after the update.
	err = c.callMakeCI(repoPath, false)
	if err != nil {...}

	// We push the changes into the repository.
	err = c.pushChanges(repoPath)
	if err != nil {...}

	// We create an MR into the main branch.
	pr, err := c.createPR(repo.Slug)
	if err != nil {...}

	// We write a new row about the service into the resulting CSV file.
	if wErr := c.outCsv.Write(...); wErr != nil {...}
	c.outCsv.Flush()

	return nil
}
```

We can run the tool using the command:

```bash
ISSUEID=82 bin/bbtools -u v.gukasov -p {git_token} repos depsupd -p {project_name} -o result.csv --title "JIRA-$ISSUEID Automatic update of dependencies" --branch "feature/JIRA-$ISSUEID-update-dependencies" -i
```
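
To give a feel for how the individual steps work, here is a minimal sketch of what updateGoVersionFile could look like; c.latestGoVersion is a hypothetical field holding the version fetched earlier, and, as noted below, the real tool relies on similar string and regex manipulation:

```go
// A sketch of the Go-version update step. c.latestGoVersion is a
// hypothetical field, e.g. "1.21". Imports: fmt, os, path/filepath, regexp.
func (c *cmdRepoDepsUpd) updateGoVersionFile(repoPath string) error {
	modPath := filepath.Join(repoPath, "go.mod")

	data, err := os.ReadFile(modPath)
	if err != nil {
		return err
	}

	// Rewrite the `go 1.x` directive with the latest version.
	re := regexp.MustCompile(`(?m)^go \d+\.\d+(\.\d+)?$`)
	updated := re.ReplaceAll(data, []byte(fmt.Sprintf("go %s", c.latestGoVersion)))

	return os.WriteFile(modPath, updated, 0o644)
}
```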

How It Works Today and Things to Improve

Once launched, the utility updates the microservices’ source code and creates the Merge Requests. Every microservice has Code Owners, who are automatically added to the MR. After that, each team merges the MR, deploys the changes, and monitors the state of the service.

Currently, we use the utility to update the sources every 1–2 months. As we calculated above, it saves us at least 90 hours of work per run. However, there is still room for improvement.

The first area of improvement is the manual work during the merge and deployment of changes. In theory, we could set up automatic merge and deployment, which would save the remaining 10 hours of work. The caveat is that updating a dependency can break a service. Our CI process ensures that the service compiles and the tests pass, but there may be errors in places not covered by tests, and some errors only appear at runtime. It might be worth setting up an automatic rollback when a service becomes unhealthy after deployment, but even that may not always work.
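
As a rough sketch of that rollback idea, assuming each service exposes a health endpoint and deployments can be reverted through some rollback hook (both are assumptions, not something we have built), the post-deploy check could look like this:

```go
// A rough sketch of a post-deploy health gate. healthURL and rollback()
// are assumptions for illustration only. Imports: net/http, time.
func watchAfterDeploy(healthURL string, rollback func() error) error {
	deadline := time.Now().Add(5 * time.Minute)

	for time.Now().Before(deadline) {
		resp, err := http.Get(healthURL)
		if err != nil || resp.StatusCode != http.StatusOK {
			if resp != nil {
				resp.Body.Close()
			}
			// The service looks unhealthy: revert to the previous version.
			return rollback()
		}
		resp.Body.Close()
		time.Sleep(15 * time.Second)
	}

	// The service stayed healthy for the whole observation window.
	return nil
}
```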

The second area of improvement is deeper code analysis. At the moment, the parsing and updating logic looks rather clumsy: string checks and regular expressions. As a result, we cannot update some parts of the service code. If we implemented AST-based code analysis, we could update the source code of the services themselves, including the code that works with the infrastructure, and change function calls.
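
For instance, here is a small sketch of what AST-based rewriting could look like using Go’s standard go/ast and go/parser packages; the framework package and the OldInit/NewInit names are made up for illustration:

```go
package main

// A minimal sketch of AST-based rewriting with the Go standard library.
// "framework", OldInit, and NewInit are hypothetical identifiers.
import (
	"go/ast"
	"go/parser"
	"go/printer"
	"go/token"
	"log"
	"os"
)

func main() {
	fset := token.NewFileSet()
	file, err := parser.ParseFile(fset, "main.go", nil, parser.ParseComments)
	if err != nil {
		log.Fatal(err)
	}

	// Walk the AST and rename calls from framework.OldInit to framework.NewInit.
	ast.Inspect(file, func(n ast.Node) bool {
		sel, ok := n.(*ast.SelectorExpr)
		if !ok {
			return true
		}
		if pkg, ok := sel.X.(*ast.Ident); ok &&
			pkg.Name == "framework" && sel.Sel.Name == "OldInit" {
			sel.Sel.Name = "NewInit"
		}
		return true
	})

	// Print the rewritten file back out.
	if err := printer.Fprint(os.Stdout, fset, file); err != nil {
		log.Fatal(err)
	}
}
```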

Conclusions

  • Microservice architecture becomes necessary when a company grows into cross-functional teams, but it brings its own set of problems to solve.
  • As the number of services grows, manually keeping them up to date becomes more expensive, and at some point it is cheaper to spend time on automation.
  • Do not build the best or most comprehensive solution right away if an MVP covers most of the problems now.

Vladislav Gukasov, Backend Developer at Space307
