Low-maintenance deployment pipelines
For DevOps teams that don’t want to spend a lot of time maintaining their deployment pipeline
As a software developer these days, you work with an ever-growing number of tools in your deployment pipeline. You have to be proficient in version control systems like Git, Mercurial or Subversion, CI servers like Jenkins or TFS, static code analysis tools like SonarQube, and deployment and infrastructure tools like Docker, Ansible, Chef and Puppet. And if you're using the Cloud, add AWS and Azure to that list. Of course this list isn't complete, but I think it's clear that the number of tools is quite high.
Using the tools is one thing, but those tools also have to be installed, configured and maintained. For example, if you want to use SonarQube, you have to install Java, install a database like MySQL or Postgres and finally install SonarQube itself. Then you have to create a database and a database user, and configure SonarQube with that user's credentials. And if you want to run it in a secure way, add configuring SSL certificates and/or a reverse proxy to the list.
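To give an impression of that last configuration step, the database credentials end up in SonarQube's conf/sonar.properties file. A minimal sketch, assuming a local Postgres database; the user name, password and database name are placeholders:

```properties
# conf/sonar.properties -- point SonarQube at the database
# (user, password and database name below are illustrative placeholders)
sonar.jdbc.username=sonar
sonar.jdbc.password=secret
sonar.jdbc.url=jdbc:postgresql://localhost/sonarqube
```

Every one of these small steps is easy on its own, but together they add up, and they have to be repeated or revisited on every upgrade.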
If you have a small DevOps team and are not supported by a team that does the maintenance for you, you have to add this to your list of responsibilities as well. These activities or chores are probably not the things that you as a team want to spend a lot of time on. So what options are there to minimize the time spent on this?
SaaS / Cloud
One easy way to avoid maintaining the tools in your pipeline is to not install them yourself on a (virtual) machine, but to use SaaS or Cloud versions of them. Examples of tools that are available in a SaaS/Cloud version are Visual Studio Team Services (the Cloud version of Microsoft's TFS), Jenkins in the Cloud (by CloudBees), GitHub and Artifactory SaaS. Big advantages are that you don't have to spend time on installing or updating/upgrading, you always run the latest version, it scales nicely with your needs and you pay for usage only. Downsides are that you don't have much control over your data (the storage location can be an issue when doing projects for governments and banks) and that connectivity to on-premise servers can be harder than having your complete pipeline on-premise.
Docker
If you want or need to have your deployment pipeline installed on your own servers, you can use Docker to avoid dealing with the dependencies of the tools you want to use. Many tools used in a deployment pipeline have an official image on Docker Hub that gets you up and running in no time. Take the SonarQube installation mentioned at the beginning of this blog as an example: with Docker you create a simple docker-compose file that combines the "sonarqube" image with a database image of your choice, and you'll be up and running in minutes. This setup also allows for easy scaling, upgrades and backups when using volume containers. Keeping state (data) out of your container is a Docker best practice, and it applies in this situation as well.
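A minimal sketch of such a docker-compose file, pairing the official "sonarqube" and "postgres" images. The service names, credentials and volume names are illustrative placeholders, and the exact environment variable names for the JDBC settings can differ between SonarQube image versions, so check the image documentation on Docker Hub:

```yaml
# docker-compose.yml -- SonarQube with a Postgres database
# (credentials and names below are placeholders)
version: "3"
services:
  sonarqube:
    image: sonarqube
    ports:
      - "9000:9000"
    environment:
      - SONAR_JDBC_URL=jdbc:postgresql://db:5432/sonar
      - SONAR_JDBC_USERNAME=sonar
      - SONAR_JDBC_PASSWORD=sonar
    volumes:
      # keep state out of the container, per Docker best practice
      - sonarqube_data:/opt/sonarqube/data
      - sonarqube_extensions:/opt/sonarqube/extensions
  db:
    image: postgres
    environment:
      - POSTGRES_USER=sonar
      - POSTGRES_PASSWORD=sonar
      - POSTGRES_DB=sonar
    volumes:
      - postgresql_data:/var/lib/postgresql/data
volumes:
  sonarqube_data:
  sonarqube_extensions:
  postgresql_data:
```

A single `docker-compose up -d` then replaces the whole manual Java/database/SonarQube installation, and because the data lives in named volumes, upgrading is a matter of pulling newer images and restarting.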
So if you find yourself spending too much time keeping your tools and infrastructure running, see whether a SaaS solution or Docker can minimize the time spent on these kinds of activities. That way you'll have more time for something more important: creating and delivering great software.