Sitri = Vault + Pydantic: continuation of the saga, local development.

Alexander Lavrov
Dec 15, 2020 · 3 min read

Background

In the previous article I wrote about how to configure your application using Sitri, but I skipped the question of local development. You will agree that deploying a Vault locally is not very convenient, and storing a local config in a shared Vault, especially when several people are working on the project, is doubly inconvenient.

In Sitri this problem is solved quite simply: with a local mode for your settings classes. You do not have to rewrite or duplicate anything, and the structure of the json file for local mode will almost completely mirror the structure of the secrets.

So, let's add literally a couple of lines of code to our project, and I'll show you how to work with all of this when the project is run locally in docker-compose…

Preparing the code

To begin with, let’s agree that local_mode is true when ENV = “local” :)
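In code this rule is trivial. A minimal sketch, assuming the variable is read the way Sitri's SystemConfigProvider does it with the prefix "superapp" (i.e. the SUPERAPP_ENV environment variable); the helper name is mine, purely for illustration:

```python
import os


def is_local_mode(environ=os.environ) -> bool:
    # in the project ENV comes via Sitri's SystemConfigProvider with
    # prefix "superapp", i.e. the SUPERAPP_ENV environment variable;
    # when it is unset we default to local mode for convenience
    return environ.get("SUPERAPP_ENV", "local") == "local"
```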

Next, I propose to slightly edit our provider_config.py and create a BaseConfig class there, from which we will inherit in our settings classes. We do this to avoid duplicating code: the settings classes themselves will then contain only what is specific to them.
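The edited provider_config.py is not embedded here, so below is a sketch of what BaseConfig can look like. It reuses the configurator, vault_client_factory and provider from the previous article; treat the module paths and names as an illustration of Sitri's API rather than an exact copy from the repo:

```python
import hvac

from sitri.providers.contrib.system import SystemConfigProvider
from sitri.providers.contrib.vault import VaultKVConfigProvider
from sitri.settings.contrib.vault import VaultKVSettings

configurator = SystemConfigProvider(prefix="superapp")
ENV = configurator.get("env")

is_local_mode = ENV == "local"
local_mode_file_path = configurator.get("local_mode_file_path")


def vault_client_factory() -> hvac.Client:
    client = hvac.Client(url=configurator.get("vault_api"))
    client.auth_approle(
        role_id=configurator.get("role_id"),
        secret_id=configurator.get("secret_id"),
    )
    return client


provider = VaultKVConfigProvider(
    vault_connector=vault_client_factory,
    mount_point=f"{configurator.get('app_name')}/{ENV}",
)


class BaseConfig(VaultKVSettings.VaultKVSettingsConfig):
    provider = provider
    local_mode = is_local_mode

    # arguments for building a JsonConfigProvider in local mode;
    # the path comes from an environment variable (see the env file below)
    local_provider_args = {"json_path": local_mode_file_path}
```

The settings classes now inherit their inner Config from BaseConfig and only add what is specific to them.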

A bit about local_provider_args: in this field we specify the arguments for creating a JsonConfigProvider instance. They will be validated, and this dictionary must match the schema, so don't worry — this is not some dirty trick. However, if you want to create the local provider instance yourself, you can just put it in the optional local_provider field.

Now, we can easily inherit the config classes from the base one. For example, a settings class for connecting to Kafka would look like this:
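The snippet itself is not embedded here; the following sketch mirrors the structure from the previous article, with field names matching the json file below (the vault_secret_key mapping and mount point are assumptions for illustration):

```python
from pydantic import Field

from sitri.settings.contrib.vault import VaultKVSettings

from superapp.provider_config import BaseConfig, configurator


class KafkaSettings(VaultKVSettings):
    auth_data: dict = Field(...)
    brokers: str = Field(...)
    auth_mechanism: str = Field(..., vault_secret_key="mechanism")

    class Config(BaseConfig):
        default_secret_path = "kafka"
        default_mount_point = f"{configurator.get('app_name')}/common"

        # where this class's keys live inside the local json file
        local_mode_path_prefix = "kafka"
```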

As you can see, the required changes are minimal. With local_mode_path_prefix we specify where in our json file this part of the general config structure lives. Now, let's write this json file for the local configuration:

{
  "db": {
    "host": "testhost",
    "password": "testpassword",
    "port": 1234,
    "user": "testuser"
  },
  "faust": {
    "agents": {
      "X": {
        "concurrency": 2,
        "partitions": 5
      }
    },
    "app_name": "superapp-workers",
    "default_concurrency": 5,
    "default_partitions_count": 10
  },
  "kafka": {
    "auth_data": {
      "password": "testpassword",
      "username": "testuser"
    },
    "brokers": "kafka://test",
    "mechanism": "SASL_PLAINTEXT"
  }
}

… Well, or just copy and paste it from the end of the last article. As you can see, everything here is very simple. For our further experiments, rename main.py in the root of the project to __main__.py so that the package can be run with a command from docker-compose.
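As a quick sanity check, here is roughly what a local-mode lookup over this file amounts to (a stdlib-only sketch of the idea; the real JsonConfigProvider handles path separators and the local_mode_path_prefix for you):

```python
import json

# a fragment of the local config file from above
config_text = """
{"kafka": {"auth_data": {"password": "testpassword", "username": "testuser"},
           "brokers": "kafka://test", "mechanism": "SASL_PLAINTEXT"}}
"""


def lookup(data: dict, path: str, separator: str = "."):
    # walk the nested dict by a dotted path, e.g. "kafka.auth_data.username"
    for key in path.split(separator):
        data = data[key]
    return data


data = json.loads(config_text)
print(lookup(data, "kafka.auth_data.username"))  # -> testuser
```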

Put the application into the container and enjoy the build

The first thing we should do is write a small Dockerfile:
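The file itself is not embedded here, so this is a minimal sketch assuming Poetry manages the dependencies (swap in pip and requirements.txt if that is what your project uses):

```dockerfile
FROM python:3.9

WORKDIR /superapp

# only the dependency manifests: the project code itself
# will be mounted into the container at run time
COPY pyproject.toml poetry.lock ./

RUN pip install poetry \
    && poetry config virtualenvs.create false \
    && poetry install --no-root
```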

Here we just install the dependencies, and that's it. Since this image is for local development, we do not copy the project code into it.

Next, we need an env file with the variables required for local-mode:
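A sketch of such an env file (the variable names assume Sitri's SystemConfigProvider with the "superapp" prefix, and the file name and path inside the container are my assumptions):

```
# .env.local
SUPERAPP_ENV=local
SUPERAPP_LOCAL_MODE_FILE_PATH=/superapp/config_local.json
```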

As you can see, nothing superfluous: no Vault configuration is needed, since in local mode the application will not even try to “knock” on Vault.

And the last thing we need to write is the docker-compose.yml file itself:
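A sketch of such a compose file (service and file names are assumptions; the code mount means the json file ends up at the path given in the environment variable):

```yaml
version: "3.7"

services:
  superapp:
    build: .
    env_file: .env.local      # the env file from the previous step
    volumes:
      - .:/superapp           # project code + config_local.json from the root
    command: python -m superapp   # works thanks to the __main__.py rename
```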

Everything is simple here too. We put our json file in the project root, at the path written above in the environment variable for the container.

Now, launch:

docker-compose up

As you can see, everything started successfully: the information from our json file passed all the checks and became the settings for the local version of the application, yuhhu!

I put the code of this “continuation” in a separate branch of the repository, so you can take a look at how it all looks after the changes: branch

Analytics Vidhya

Analytics Vidhya is a community of Analytics and Data Science professionals. We are building the next-gen data science ecosystem https://www.analyticsvidhya.com

Alexander Lavrov

Written by

Backend developer from Saint Petersburg. I work and contribute to open source projects. Write me :)
