Legacy Tools in Modern Stacks Part III: Using Docker For Local Tooling

Yann Boisclair-Roy
Published in SSENSE-TECH
Sep 5, 2019 · 5 min read

*This article is Part III of the series; see Part I and Part II for the previous installments.


With all the different microservices and applications running in different languages at SSENSE, it has become very natural to containerize everything using Docker. Nearly all of our projects are using it, and it has served us well so far. For those who have no idea what Docker is, I suggest reading their documentation where they explain what containerization is and how Docker handles it.

Docker is generally used to containerize and deploy applications on a container orchestrator like Kubernetes, Rancher, Docker Swarm, etc., but did you know that you can use it to deploy tools locally and bypass all the hassle related to setup?

Implementing a Documentation Compliance Tool

In the last year, SSENSE has been moving forward with establishing an InnerSource philosophy for all of its services. A primary paradigm of inner-sourcing is to have great documentation for every aspect of every service. With that in mind, we conducted some research to understand the elements of great documentation. We challenged ourselves internally and defined a documentation standard for all our GitHub repositories.

The next step was to develop a tool to use internally which could generate documentation skeletons for new services and check for missing docs in existing ones.

For our engineering department to adopt and use such a tool with minimal disruption, we needed:

  • A way for our engineers to use this tool without having to install libraries locally
  • An operating system agnostic tool
  • A way to ensure that our engineers could always use the latest version of the tool seamlessly.

A couple of tricks allowed us to use Docker for this.

First, we created a GitHub repository with all our normal settings and internal processes. In this repository, we started by creating the documentation skeleton with explanations for each part. As demonstrated in Part I of this series, we then created a Makefile with different commands, making it very easy to contribute to. We also added high-level commands like `make build` to quickly build the Docker container, `make shell` to instantly enter a working container, and `make review` to run all quality checks (unit tests, linting, code coverage, etc.).
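To give a sense of what those targets wrap, here is a minimal sketch of such a Makefile. The image name matches the one used later in this article, but the target bodies and the internal `test`, `lint`, and `coverage` targets are assumptions rather than our exact recipes:

# Hypothetical excerpt; recipes must be indented with a tab
IMAGE := ssense/documentation-inspector

build:
	docker build -t $(IMAGE) .

shell: build
	docker run -it --rm -v $(PWD):/code/target $(IMAGE) /bin/sh

review: build
	docker run --rm -v $(PWD):/code/target $(IMAGE) make test lint coverage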

Inside this repository, we developed two tools:

  • The Doc-Generator, designed to generate the documentation skeleton in any given repository to which you have access (either directly on the master branch or through a Pull Request). This is triggered by the `make create` command (see the example after this list).
  • The Doc-Checker, which scans your current folder and evaluates which parts of the documentation within it are not up to standard. This is triggered by the `make inspect` command.
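For example, if the generator is pointed at the mounted folder, the invocation follows the same pattern as the inspection command shown later in this article, with only the Make target changing. Treat this as a sketch; the real command may also need repository details for the Pull Request flow:

$ docker pull ssense/documentation-inspector && docker run -it --rm -v $(pwd):/code/target ssense/documentation-inspector make create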

Note that to work with everything listed so far, an engineer never needs to install anything locally. They also don't need to know the language the tools are written in. All they need is Docker to run the code and Make (pre-installed in the container) to interface with the internal commands.

When someone contributes to the codebase, once the change passes all the quality gates of our CI pipeline, a new image tagged `:latest` is pushed to DockerHub automatically.
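The pipeline definition itself isn't shown here, but the final CI step conceptually boils down to something like the following (the exact build and push commands are assumptions about how the image gets published):

$ docker build -t ssense/documentation-inspector:latest .
$ docker push ssense/documentation-inspector:latest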

Let’s See Some Commands

Alright, so you have a fancy documentation tool on DockerHub. How do you use it?

Assuming that you have access to DockerHub and that you are logged in, all you need to call is the following:

$ docker pull ssense/documentation-inspector && docker run -it --rm -v $(pwd):/code/target ssense/documentation-inspector make inspect

* Note that this image is private so you won’t actually be able to use this command.

Command breakdown:

  • First, we pull the documentation inspector image, so we always have the latest version of the tool.
  • Then we run a new container with the following parameters:
    • `-it` to run the process interactively with a pseudo-TTY, giving us access to STDIN and STDOUT
    • `--rm` to automatically remove the container once the process is done
    • `-v` to mount a volume between the container and the folder where the command is executed
  • We then give it the Docker image to run. Note that if no version tag is provided, Docker fetches the `:latest` tag automatically (see the example after this list for pinning a specific version).
  • Finally, we run the appropriate Make command inside the container.
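If you ever need to hold everyone to a specific release instead of `:latest`, you can pin the tag explicitly. The version number below is purely hypothetical:

$ docker pull ssense/documentation-inspector:1.2.0 && docker run -it --rm -v $(pwd):/code/target ssense/documentation-inspector:1.2.0 make inspect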

The result of this command is a list, printed in the terminal, of the documentation sections that are missing. This makes it easy for the engineer to bring their documentation up to standard.

But I know what you’re gonna say: “So you’re expecting me to write this kind of command every time I want to check my documentation?! I don’t want to remember all that!”

This is where the command from Part II of this series comes in handy. Let’s make an alias to shorten it!

$ alias doc-check='docker pull ssense/documentation-inspector && docker run -it --rm -v $(pwd):/code/target ssense/documentation-inspector make inspect'

And there you go! Now we can simply type `doc-check` and it will grab the latest version of the documentation standards and check the files and folders where the command is run against them.
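Keep in mind that an alias defined at the prompt only lives in the current shell session, so you will want to add that line to your `~/.bashrc` or `~/.zshrc`. If you would rather have a single entry point for both tools, a small shell function works too. This is only a sketch, and the `doc-tool` name is made up; it defaults to `inspect` but lets you pass `create` instead:

# Add to your shell profile; usage: doc-tool [inspect|create]
doc-tool() {
  docker pull ssense/documentation-inspector && \
  docker run -it --rm -v "$(pwd)":/code/target ssense/documentation-inspector make "${1:-inspect}"
}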

It doesn't get much easier than that! By running just one small command, you can now fetch the latest version of a container and run it against your current folder, without installing anything other than Docker.

Other Ideas

With some creativity, you can now use this pattern to build multiple other handy tools, such as:

  • Any stateless tool that you would normally have to install locally but can simply run in Docker (see the example after this list).
  • Any type of generator (code, configs, skeletons, etc.) to speed up development.
  • Any kind of enforcement tool for standards and quality, which you can run either locally or in your automated pipeline.
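For instance, a Dockerfile linter like hadolint can be run straight from its public image with nothing installed locally, following the same pattern. The invocation below mirrors the project's documented Docker usage, so double-check it against the version you pull:

$ docker run --rm -i hadolint/hadolint < Dockerfile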

As we come to conclude this three-part series, I hope you have learned a few useful tricks, and that it’s not always necessary to reinvent the wheel in our line of work. With just a little help from you, ‘old’ but efficient libraries and commands can age like fine wine.

Editorial reviews by Deanna Chow & Prateek Sanyal.

Want to work with us? Click here to see all open positions at SSENSE!
