Running your local dev environment inside a container — VS Code Remote, GitHub Codespaces

Lewis Holmes
Published in ASOS Tech Blog
10 min read · Dec 14, 2021

Over the past decade, containers have become a major trend in software engineering. This is unsurprising: they are more efficient than virtual machines, they provide a consistent way to package and deploy applications, and they offer increased portability so that applications run consistently across a variety of operating systems.

At ASOS we aim to run all of our services and applications inside containers, whether they are hosted in Azure App Service or on Azure Kubernetes Service (AKS). We have seen a number of benefits from this approach, including better resource utilisation, improved reliability and resiliency, faster and more stable build and deploy pipelines, and, as a result of all of this, large cost savings. However, a question I’ve been asking myself more recently is how we could leverage the benefits of containers further by utilising them within our engineers’ local development environments.

You may be wondering why you would want to use containers to run your local development environment. Well, here are some common pain points I see our engineering teams facing that I believe could be eased by running our development environments within containers.

  • 👩🏾‍💻🧑🏻‍💻 Faster onboarding experience — ASOS is on a growth trajectory, and that means that our tech team is growing too, so we are always looking for great new tech talent (we’re hiring 😉). When new engineers start, we want them to be able to commit code to a project on the same day. To achieve that goal we need to automate the setup of their development environment as much as possible.
  • 🔁 Simplified and more consistent development environments — We have various combinations of automated scripts, and some manual steps, for setting up engineers’ development environments, but these are difficult and time-consuming to maintain. It would be great to provide more consistent development environments which can be torn down and recreated quickly and simply.
  • 💻 Development environments that better mirror production — We run all of our containers, from test environments to production, on the Linux operating system, but our engineers generally use Windows or macOS for local development. We’d like to bring those environments closer together to minimise the risk of things not working as expected, and also to make our engineers more familiar with the platforms their code runs on in production.

With those aspirations in mind, how can you go about utilising containers for your local environment setup? At ASOS Tech our engineers predominantly use Visual Studio and Visual Studio Code to develop software, so I looked to see what Microsoft offers in this area. Microsoft provides several tools to support this, and they all come under the term ‘Remote Development’. Within Remote Development, there are three options available:

  1. Container(s)
  2. Remote Machine (SSH)
  3. Windows Subsystem for Linux (WSL)

So far I have focused primarily on option 1, Containers, but all three options work in a similar way.

VS Code Remote Containers

When looking at using containers for your development environment, there is an option called ‘VS Code Remote Containers’ which allows you to use one or more Docker containers to provide a full development environment that you can interact with using VS Code. You can specify exactly what tools, runtimes and VS Code extensions you want to have available in the development environment, alongside the code repo you are working on. With this approach you could configure an environment for typical C# back-end development with the .NET SDK, the Azure CLI and some of your favourite VS Code extensions, like a Test Explorer. For another project that perhaps requires a persistence layer running locally, you could set up another development environment inside a container that also specifies an additional container running a SQL Server 2019 database or the Cosmos DB Emulator. The possibilities are endless!

Once these environments are set up inside a container they only run (and use local CPU and memory resources) while you are connected to them, and switching between these fully isolated environments is as simple as attaching to another container, which takes just a few seconds. If anything ever goes wrong with one of these environments, you can easily get back to a good state by re-building the container(s). This can be done very quickly with a single command and is fully automated. When re-building the environment, what happens under the hood is that the Docker image for the environment is re-built, a new container is created from that Docker image, and then VS Code sets up the environment inside the container and attaches your local VS Code to it.

How does this all work?

VS Code Remote Containers works via a client-server model where your local instance of VS Code communicates to a component called ‘VS Code Server’ that is running inside the container. The VS Code Server is seamlessly installed inside the container for you the first time you connect.

Your source code can either be pulled down and stored directly inside the container or you can mount a code repo residing on your local filesystem into the container. It may feel more familiar to still have the code repo present locally on your machine and mounted into the container while you are trying out remote containers initially. This is the approach I took so that I always had the option to open the code repo without using remote containers and work as I previously did.

When debugging your application, specific ports are opened automatically for you and mapped between your local machine and the container. This allows you to debug the running application as if it was running on the local Operating System.
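Beyond the automatic detection, ports can also be declared explicitly in the devcontainer.json configuration using the forwardPorts property. A minimal sketch, assuming a web API listening on port 5000 (the port number is just an example):

```json
{
  // Ports inside the container to forward to the local machine
  "forwardPorts": [ 5000 ]
}
```

Declaring ports in configuration means every engineer who opens the project gets the same forwarding behaviour, rather than relying on runtime detection.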

Remote Development with Containers architecture

Getting Started

To get started with VS Code Remote Containers you need to install VS Code, Docker Desktop and the VS Code Remote Containers extension on your local machine.
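If you have the code command-line launcher on your PATH, the extension can also be installed from a terminal rather than through the UI; a quick sketch (the identifier below is the extension’s marketplace ID):

```
# Install the Remote - Containers extension via the VS Code CLI
# (assumes the `code` launcher is installed and on PATH)
code --install-extension ms-vscode-remote.remote-containers
```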

Once you have the software set up, there are many samples and tutorials provided by Microsoft to help you get started. First, I tried this sample repository tutorial, which uses a very simple ‘Hello World’ style Node.js application. This tutorial can be completed in around 10 minutes and gives you a good taster of how the process works and how you can configure your development environment inside the container.

Next I wanted to try this with a project closer to what I’d normally be working on, like a Web API with .NET and C#. Therefore, I created a new Web API project using the dotnet new webapi template, opened it up in VS Code and then set it up to support development inside a container. Remote Containers supports ‘configuration as code’: to specify what software and tools you want available in your development environment, you define this as configuration inside a .devcontainer folder and commit it to your code repo. Because the configuration is stored in your repo, anyone else working on the project can use Remote Containers too, and they can even contribute changes to the configuration.

To add the configuration to my repo I opened the command palette in VS Code and ran the command called ‘Remote Containers: Add Development Container Configuration Files’. You then get an option to select the definition to use. As I was planning to work with .NET and C#, I chose the definition named “C# (.NET)”. You also get options to add things like Node or the Azure CLI to the environment. Once this command was executed, it added a new .devcontainer folder to my project with a number of files included.

The two main files to note are:

  • devcontainer.json — used to define settings for the container, VS Code Extensions to install, ports to open etc.
  • Dockerfile — used to tell Docker how to build the image to be used for the container environment. You could modify this to add additional OS packages to the container if required.
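For illustration, the Dockerfile generated for a .NET definition builds on one of Microsoft’s pre-built dev container base images. A rough sketch of what extending it with an extra OS package might look like (the image tag and the jq package are examples, not the exact generated content):

```dockerfile
# Base dev container image for .NET (tag is illustrative)
FROM mcr.microsoft.com/vscode/devcontainers/dotnet:0-5.0

# Example: install an additional OS package into the environment
RUN apt-get update \
    && apt-get install -y --no-install-recommends jq \
    && rm -rf /var/lib/apt/lists/*
```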
Adding remote container configuration to an existing project

Here is a sample of what the devcontainer.json looked like:

{
  "name": "C# (.NET) Environment",
  "build": {
    "dockerfile": "Dockerfile",
    "args": {
      "VARIANT": "5.0", // .NET version to use
      "INSTALL_AZURE_CLI": "true"
    }
  },
  // Container-specific settings.json values set on container create
  "settings": {},
  // VS Code extensions to be installed when the container is created
  "extensions": [
    "ms-dotnettools.csharp"
  ]
}
Now that the configuration had been created, I opened the command palette in VS Code and ran the command named “Remote-Containers: Rebuild and Reopen in Container”. Once this was executed, VS Code started building my development environment inside a container. Once the Docker image was built and a container was created, VS Code attached into the container and I was ready to go. I was able to view and edit the files in my repo, use Git, access a terminal and command-line tools like the Azure CLI, and run and debug my application (all running inside my development container!).

What I Learnt

Here are the six key things I learnt from trying out Remote Development with Containers:

  1. It’s really quick and easy to get up and running with minimal setup/software to install. The environment only takes a few minutes to be created the first time and tearing down/re-building the environment is even faster and consistently repeatable.
  2. There are a large number of development environment definitions already provided by Microsoft for most types of environment like C#, F#, Azure Functions, Node, Swift, Go, Python, Java, Ruby and more! These are a great starting point and can easily be modified further for your particular needs. I even used one of these definitions to help setup a multi-container environment including a .NET application and a SQL Server 2019 Database instance running inside a separate container.
  3. You may wonder how to handle secrets with this approach. I ended up using the dotnet user-secrets commands from the terminal available inside the container environment to set up user secrets as a one-off. Going further, I would like to fully automate this step so that no manual steps are required to set up the environment.
  4. As long as your runtime and tools support Linux then you should be able to adopt this approach. One tool our engineers use a lot at ASOS Tech is the Cosmos DB Emulator which traditionally only supported Windows but this is now in preview for Linux Container support which is great news!
  5. Running a development environment inside a Docker container locally can use quite a lot of system resources, so depending on the spec of your machine you may need to spend some time adjusting your Docker configuration to optimise the experience.
  6. You are currently limited to using VS Code for interacting with the development environment, so many people may miss Visual Studio. However, during this process I spent some time customising my VS Code and installing specific extensions, which got me extremely close to the development experience I was used to with Visual Studio, ReSharper and NCrunch (more to come on this in the future). This really surprised me, and I have actually started to prefer VS Code over Visual Studio!
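The multi-container setup mentioned in point 2 is typically achieved by pointing devcontainer.json at a Docker Compose file instead of a single Dockerfile. A rough sketch of what that Compose file might look like for an app plus SQL Server 2019 (the service names, password and mount path are placeholders, not our actual configuration):

```yaml
# docker-compose.yml sketch for a two-container dev environment
services:
  app:
    build:
      context: .
      dockerfile: Dockerfile
    volumes:
      - ..:/workspace:cached   # mount the code repo into the container
    command: sleep infinity    # keep the container alive for VS Code to attach
  db:
    image: mcr.microsoft.com/mssql/server:2019-latest
    environment:
      ACCEPT_EULA: "Y"
      SA_PASSWORD: "<yourStrong!Password>"  # placeholder only
```

The devcontainer.json then references this file via its dockerComposeFile and service properties, so VS Code knows which service to attach to while Compose brings up the database alongside it.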

Taking it to the next level with GitHub Codespaces

While experimenting with Remote Containers I came across GitHub Codespaces. Codespaces builds upon the idea of running a development environment inside a container, but takes things to the next level: the container now runs on a virtual machine (VM) in Azure rather than on your local machine. You get a dedicated Linux VM running your development environment, and you can choose from various levels of performance, from a two-core machine all the way up to 32 cores for some serious horsepower. You only pay for what you use, and any Codespace automatically shuts down if left idle for more than 30 minutes.

Codespaces uses the same .devcontainer configuration as Remote Containers, described earlier, so once you have a remote container working locally with Docker (and your repo is stored in GitHub), you can try out Codespaces with the click of a few buttons. The extra impressive thing about Codespaces is that you don’t even need a local VS Code instance to interact with the environment, as you also have the option to run VS Code fully in a web browser! More importantly, moving your development environment to a Linux VM in Azure means that your local laptop’s performance no longer matters. You could be using a cheaper, lower-spec laptop, as all the heavy lifting has been shifted to a VM in the cloud. Codespaces is something ASOS Tech are really excited about and are considering adopting.

Launching a Codespace from a Web Browser

I personally feel that remote development using VMs in the cloud is really the future of how engineers will develop code. Just as organisations have moved their applications to run inside containers hosted in the cloud, the same revolution is now happening for local development environments.

Have you tried out VS Code Remote Containers yet? What did you think? How about GitHub Codespaces too? If you haven’t yet, I encourage you to give it a try using the tutorials and resources below.

Useful Resources

VS Code Remote

GitHub Codespaces

Lewis is a Principal Software Engineer at ASOS. In his spare time he aspires to be more than a bedroom DJ, loves exploring nature with his family and jumps at any chance to play with Lego.

Did you know that ASOS are hiring across a range of roles in Tech? See our open positions here.
