Enterprise DevOps — Configuration Management
My current customer has what we call “configuration sprawl”: many types of configuration, supporting many different environments, living in many different places. They have hundreds of environments supporting a suite of applications, each backed by a diverse collection of infrastructure and release tools. The configuration supporting these environments is duplicated across components and stored in many different ways, which means that updating something as simple as a server credential is extremely risky. Below is a high-level overview of the problem and how I’m tackling it.
My initial goal was small: automate smoketests. The customer was manually running a series of Visual Studio webtests to smoketest their applications post-deployment before promoting a build to the next stage in their deployment pipeline. Environment-specific parameters were manually provided to these smoketests at runtime, most of which were sourced from a version-controlled Excel spreadsheet, a few CSV files and some other sources. This was an error-prone and time-consuming process, so in the spirit of DevOps I wanted to see if the pain points surrounding smoketest automation were a symptom of a larger problem. Their “configuration story” looked something like this:
- Infrastructure: ARM templates were being used with environment-specific json configuration files checked into source control.
- Deployments: The customer is using a combination of VSTS Release (vNext) and Release Management to orchestrate their deployments. Deployment configuration was being stored within (and duplicated between) these tools.
- Application: Application configuration was managed through a combination of environment-specific .config files and transformations within application code that were applied at build time. This meant that any configuration change, however small, required a rebuild.
- Testing: Configuration was being stored in a combination of Excel and CSV files that were used to manually drive the webtests.
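As a concrete example of the build-time coupling, the transform approach looks roughly like a standard XDT config transform. This snippet is illustrative only; the keys and URLs are hypothetical, not the customer’s actual config:

```xml
<!-- Web.Staging.config — a hypothetical XDT transform applied at build time -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <appSettings>
    <!-- Bakes the staging endpoint into the build output;
         changing this value later means rebuilding the application -->
    <add key="ServiceEndpoint" value="https://staging.contoso.example/api"
         xdt:Transform="SetAttributes" xdt:Locator="Match(key)" />
  </appSettings>
</configuration>
```

Because the transform runs at build rather than deploy, the environment value is frozen into the artifact.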
The problems are numerous here:
- Environment-specific configuration was stored in source-controlled ARM templates, in two different release tools (vNext and Release Management), in Excel, in CSVs, in source-controlled environment-specific config files and in transforms. This is a serious level of sprawl.
- Configuration was being duplicated between these locations.
- Changing configuration usually meant that both a rebuild and redeploy were needed.
The goals became:
- Centralize config to address sprawl and duplication.
- Standardize the application of this config across Infrastructure, Deployments, Code, and Test.
I don’t know if there’s an industry-standard approach to this problem, but here’s my train of thought:
- Since we have two different Release tools (Release Management for legacy apps and VSTS Release for newer stuff) we can’t store the configuration in the tool. This would require a duplication of efforts and configuration. Even if we had a single tool, this customer has a high number of environments with hundreds of configuration entries each. The VSTS Release UI is great for smaller sets of configuration but isn’t intended to manage this volume.
- Since we have configuration needs that span from code to deployments to infrastructure and test, source-controlled config-per-environment would get very messy very quickly.
- If config can’t be stored in the release tools, and it can’t be stored in source control, then we need to store it in a custom, centralized config store — Table storage or SQL, for example.
- This also means that we need some standardized way to extract config from this config store at deploy time and apply it to whatever files require configuration. Things like appsettings.json, web.config, app.configs, ARM json files, etc.
- The “hydration” of config should not be a “consumer” concern. This needs to be a release concern. This way we are not imposing additional overhead on the components that require configuration. For example — instead of having the smoketest tool pull config via a data source binding or HttpClient plugin at runtime — have the release tool inject config into a settings file at deploy time.
The initial plan is as follows:
- Starting with the smoketest tool, move the necessary configuration away from excel/CSV and store it in a config store that lives behind an API.
- Build a basic CRUD app to manage this configuration via a hosted GUI with secured API access.
- Use a `settings.xml` file to store environment-specific configuration alongside the smoketest project. Use a corresponding `settings.xml.token` file with tokens that the release tool injects configuration into at deploy time.
- Create a custom release task (a PowerShell script that runs a custom EXE) that requests configuration key/value pairs from the config store API and then hydrates the target token files, converting them into usable configuration files.
- Gradually move configuration from all applications and environments into this tool, create `.token` files, and extend the capabilities of this solution to suit additional components as needed.
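To make the token-file convention concrete, a `settings.xml.token` might look like this. The key names and the `__Key__` placeholder syntax are illustrative assumptions, not the customer’s actual format:

```xml
<!-- settings.xml.token — checked in next to the smoketest project.
     The release task replaces each __Key__ placeholder at deploy time,
     writing the result out as settings.xml. -->
<settings>
  <add key="BaseUrl" value="__BaseUrl__" />
  <add key="SqlConnectionString" value="__SqlConnectionString__" />
</settings>
```

The `.token` file is safe to version-control because it contains no real values; only the hydrated `settings.xml` ever holds environment-specific data.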
This seems appealing to me because it means we’re standardized, centralized and tooling-agnostic. We can use TFS Release, Release Management, Jenkins, whatever. I see this as a form of dependency injection: we’re decoupling the config from the build and injecting it as a dependency somewhere along the build/release pipeline.
One issue with this approach is that we will have .config and .token files living next to each other; once a developer makes a change to the structure of their .config file, they also need to make the same change to the corresponding .token file. Given how infrequently config structure usually changes, I think this can be addressed by forcing a peer review or dev lead approval when making changes to .config files.
Creating the Configuration Store
The MVP approach to the configuration store looks something like this:
- An Azure SQL database with two tables, Environments and ConfigurationEntries, with a one-to-many relationship between them.
- A barebones CRUD MVC app to administer these.
A utility was then written to migrate the Excel data into the database. Throw an API endpoint on top and we’re ready to automate smoketesting.
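A minimal schema for this MVP might look something like the following. The table and column names are illustrative sketches, not the customer’s actual schema:

```sql
-- Environments: one row per deployment environment
CREATE TABLE Environments (
    EnvironmentId INT IDENTITY PRIMARY KEY,
    Name          NVARCHAR(100) NOT NULL UNIQUE  -- e.g. 'Dev', 'QA-07', 'Prod'
);

-- ConfigurationEntries: many entries per environment (one-to-many)
CREATE TABLE ConfigurationEntries (
    EntryId       INT IDENTITY PRIMARY KEY,
    EnvironmentId INT NOT NULL REFERENCES Environments(EnvironmentId),
    [Key]         NVARCHAR(200) NOT NULL,
    [Value]       NVARCHAR(MAX) NOT NULL,
    CONSTRAINT UQ_Env_Key UNIQUE (EnvironmentId, [Key])
);
```

The unique constraint on (EnvironmentId, Key) prevents the duplication problem from creeping back into the new store.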
Creating a Custom Release Task
We don’t want the application or test code to be responsible for pulling config. The release tooling should push config into the target applications.
Since we’re now using a custom config store, we can’t use the out-of-the-box release tasks. We need to create a custom task that:
- Requests config from the config store for a particular environment
- Hydrates token files with these values and transforms them into config files
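The hydration step itself is simple string substitution. Here is a minimal sketch in Python — the real task is a PowerShell-wrapped EXE, and the `__Key__` token syntax and function names below are my own assumptions for illustration:

```python
import re
from pathlib import Path


def hydrate(template: str, config: dict) -> str:
    """Replace each __Key__ token with its value from the config store.

    Raises KeyError if the template references a key the store doesn't
    have, so a missing setting fails the release instead of silently
    deploying a broken config file.
    """
    def substitute(match):
        key = match.group(1)
        if key not in config:
            raise KeyError(f"No value for token '{key}'")
        return config[key]

    return re.sub(r"__(\w+?)__", substitute, template)


def hydrate_file(token_path: Path, config: dict) -> Path:
    """Turn e.g. settings.xml.token into settings.xml alongside it."""
    target = token_path.with_suffix("")  # strip the trailing .token
    target.write_text(hydrate(token_path.read_text(), config))
    return target
```

Failing fast on a missing key is a deliberate choice: a release that stops with an error is cheaper than an environment running with a placeholder value.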
With this in place, we’ve now centralized config and standardized its consumption, reducing risk and enabling automation in the process.
Here’s the before picture:
And here’s the after:
Much simpler — config has been centralized and it’s applied at deploy time. The overall result here is cost savings in the form of risk reduction and automation.