Remote Desktop, Continuously Delivered
A deep dive into my remote desktop environment, built with Chrome Remote Desktop, Packer, and Terraform.
I delivered a talk at DevOpsDays Philadelphia in 2019 that outlined it briefly, but a few folks requested a walkthrough of how I constructed it. This post looks at the demo behind the talk. If you’re interested in the slides, I have them on SpeakerDeck. Here are the tools at a glance:
- Public Cloud Provider: Google Cloud Platform
- Pipeline: CircleCI, Terraform Cloud
- Image Creation: Packer, Vagrant
- Remote Desktop: Chrome Remote Desktop
Why did I do this?
When I became a traveller a few years ago, I often had to carry two laptops, a mobile phone as a WiFi hotspot, a tablet to watch movies, and a battery pack. Not only was it irritating to remove all of my devices at airport security, but it was also painful for my back. My doctor recommended I get a much smaller laptop to tote around, but it was an expensive option and, quite frankly, I had enough electronic devices that I didn’t want to add to the collection.
Instead, I looked for a way to minimize my devices by getting a remote desktop development environment on my tablet. Many of the virtual desktop technologies I looked at required licenses. I stumbled upon Chrome Remote Desktop, a remote desktop software tool from Google. I could remote into another laptop or a Google Cloud Platform instance. The problem? It took me about three hours to set up and organize each time. How could I deliver it faster, on-demand, each time?
Objectives
I had three primary objectives:
- Repeatably create the same development desktop in a region of my choice, to minimize latency.
- Use my favorite IDE and browser in the environment with my tablet.
- Destroy it in order to minimize cost when I don’t need it.
Granted, I could switch my development environment to using vim and other shell tools (which a number of people enjoy) but I’m personally more productive and comfortable using an IDE.
Create an Immutable Image
When I set up a virtual machine, I typically create an immutable image with packages and binaries pre-installed. This way, when the machine initializes, it comes up with everything I need. My pipeline to build the image often includes:
- Build the image by running scripts that install the packages
- Test that the packages were installed
- Add the image to my image repository
In order to facilitate this, I opted to use CircleCI as my CI framework and Packer to build the virtual machine image. I installed the Chrome Remote Desktop binary with a bash script and added my favorite development tools, including Visual Studio Code, Golang, and Terraform. Below is my Packer configuration, which shows that the base image is Debian and references some of my installation scripts.
```json
{
  "variables": {
    "username": "{{env `TF_VAR_crd_user`}}",
    "gcloud_zone": "{{env `GOOGLE_COMPUTE_ZONE`}}",
    "commit_hash": "{{env `COMMIT_HASH`}}",
    "project": "{{env `GOOGLE_PROJECT`}}"
  },
  "builders": [
    {
      "image_name": "chrome-remote-desktop-debian-9-{{user `commit_hash`}}",
      "instance_name": "crd-test-{{uuid}}",
      "image_family": "crd-debian-9",
      "type": "googlecompute",
      "project_id": "{{user `project`}}",
      "source_image_family": "debian-9",
      "source_image_project_id": "debian-cloud",
      "ssh_username": "{{user `username`}}",
      "zone": "{{user `gcloud_zone`}}",
      "machine_type": "n1-standard-1",
      "labels": {
        "builder": "packer",
        "owner": "{{user `username`}}"
      },
      "image_labels": {
        "builder": "packer",
        "owner": "{{user `username`}}"
      }
    }
  ],
  "provisioners": [
    {
      "type": "shell",
      "execute_command": "sudo bash -c '{{.Vars}} {{.Path}}'",
      "scripts": [
        "./install-desktop.sh"
      ]
    },
    {
      "type": "shell",
      "execute_command": "{{.Vars}} bash '{{.Path}}'",
      "scripts": [
        "./install-devtools.sh",
        "./test.sh"
      ]
    }
  ]
}
```
To examine the contents of the installation scripts, see this repository for the image pipeline. Packer handles the construction, test, and upload of the image to Google Cloud Platform. Typically, I would add kitchen-inspec to the pipeline to run InSpec tests against the instance, but Ruby packaging can take some time to process.
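To give a sense of what the provisioning scripts do, here is a hypothetical sketch of `install-desktop.sh` — the real scripts live in the image pipeline repository, and the package choices here (Xfce as the desktop environment) are assumptions. It defaults to a dry-run mode that prints each command instead of executing it:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of install-desktop.sh; the real scripts live in
# joatmon08/chrome-remote-desktop-image. With DRY_RUN=1 (the default here),
# commands are printed instead of executed.
set -euo pipefail

run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

# Install a minimal desktop environment for Chrome Remote Desktop to attach to.
run apt-get update
run apt-get install -y --no-install-recommends xfce4 desktop-base

# Download and install the Chrome Remote Desktop host package.
run wget -q https://dl.google.com/linux/direct/chrome-remote-desktop_current_amd64.deb
run apt-get install -y ./chrome-remote-desktop_current_amd64.deb

# Tell Chrome Remote Desktop which session to launch on connect.
run bash -c 'echo "exec /etc/X11/Xsession /usr/bin/xfce4-session" > /etc/chrome-remote-desktop-session'
```

Because Packer runs this provisioner with `sudo bash -c`, the script can assume root without calling `sudo` itself.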
In the future, I would like to add security scanning. I have not chosen a tool yet, so I would appreciate a recommendation!
My favorite part about having an image pipeline is that no matter which device I’m working on, I can add new packages and configuration, push, and let the pipeline run without installing any new dependencies on my laptop. If I want to do more thorough checks of my scripts, I can create a Vagrant machine and run the same exact scripts!
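That Vagrant check can be sketched with a minimal `Vagrantfile` — the box name is an assumption chosen to match the Debian 9 base image, and it reuses the pipeline's scripts unchanged:

```ruby
# Hypothetical Vagrantfile -- runs the same provisioning scripts Packer uses.
Vagrant.configure("2") do |config|
  # Debian 9 base box, matching the Packer source_image_family.
  config.vm.box = "debian/stretch64"

  # install-desktop.sh runs as root in the Packer build, so run it privileged.
  config.vm.provision "shell", path: "./install-desktop.sh", privileged: true

  # Dev tools and tests run as the regular user, mirroring the second provisioner.
  config.vm.provision "shell", path: "./install-devtools.sh", privileged: false
  config.vm.provision "shell", path: "./test.sh", privileged: false
end
```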
Remote Desktop On-Demand
When I travelled to the UK, I wanted a remote desktop environment in Europe so I only had to carry my tablet. In order to do this, I executed the following:
- Manually go to remotedesktop.google.com/headless and copy the refresh token. (I haven’t found a way to automate this step yet.)
- Add the refresh token and update the region I want to deploy to in my CircleCI pipeline’s environment variables.
- Run the pipeline!
By the end of the pipeline run, I can see the name of my GCP instance under “Remote Devices”. Easy right?
Terraform for Creating the Host
Automating this was not so easy. I used Terraform to retrieve the image identifier created by my image pipeline and create a GCP instance in the region of my choice. I added this to a repository for my CircleCI pipeline to use. `host.tf` in the repository contains the bulk of the specification.
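A sketch of what `host.tf` might contain — the resource names, machine type, and zone suffix here are assumptions; the real file lives in the deployment pipeline repository:

```hcl
# Hypothetical sketch of host.tf -- names and sizes are assumptions; see
# joatmon08/chrome-remote-desktop-pipeline for the real specification.

variable "project" {}
variable "region" {}
variable "crd_user" {}
variable "ssh_public_key" {}

# Look up the latest image published by the Packer pipeline.
data "google_compute_image" "crd" {
  family  = "crd-debian-9"
  project = var.project
}

resource "google_compute_instance" "crd" {
  name         = "chrome-remote-desktop"
  machine_type = "n1-standard-2"
  zone         = "${var.region}-b"

  boot_disk {
    initialize_params {
      image = data.google_compute_image.crd.self_link
    }
  }

  network_interface {
    network = "default"
    access_config {} # ephemeral public IP so the pipeline can SSH in
  }

  # The SSH user must match the Packer build user (TF_VAR_crd_user).
  metadata = {
    ssh-keys = "${var.crd_user}:${var.ssh_public_key}"
  }
}

# Expose the public IP for the post-provisioning SSH step.
output "public_ip" {
  value = google_compute_instance.crd.network_interface[0].access_config[0].nat_ip
}
```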
To execute Terraform, I use Terraform Cloud, a SaaS offering for Terraform. It stores my remote state and executes a remote plan and apply. I don’t want to set up new runners and an external state store and Terraform Cloud’s free tier handles the execution and storage for me. As a result, I don’t need to bootstrap more infrastructure for Terraform management.
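Pointing Terraform at Terraform Cloud only takes a small backend block; something like this, where the organization and workspace names are placeholders:

```hcl
# Hypothetical remote backend configuration -- organization and workspace
# names are placeholders. Terraform Cloud stores the state and runs the
# plan and apply remotely.
terraform {
  backend "remote" {
    hostname     = "app.terraform.io"
    organization = "my-org"

    workspaces {
      name = "chrome-remote-desktop"
    }
  }
}
```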
Post-Provisioning on the Host
While Terraform Cloud handles remote execution and state management, it does not handle the post-provisioning steps that I needed for Chrome Remote Desktop to work. To start Chrome Remote Desktop, I needed to start the binary on the host. Problematically, the refresh token has to be injected manually, and the PIN to log into the host has to be supplied via standard input. I attempted to use `cloud-init` to start Chrome Remote Desktop without much luck. My fallback was SSH access. I had to ensure the user Packer specified to install all of the packages was the same user I would use to log into the instance; otherwise, I could not apply the token and PIN. I set the user in the pipelines as `TF_VAR_crd_user`.
Next, I attempted to run a local provisioner to SSH into the instance and run the `start-host` command. While this worked locally, it did not work in my pipeline: the local provisioner I initially used required the `gcloud` command-line binary, which is not installed on the Terraform Cloud runners. To bypass all of these concerns and prevent my PIN from being compromised, I instead issued the `start-host` command from CircleCI. `crd.sh` in the repository contains the full script.
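The idea behind `crd.sh` can be sketched as follows — the variable names, the placeholder host IP, and the dry-run default are assumptions for illustration; the full script is in the repository. The PIN is piped over standard input so it never appears in a process list or shell history:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of crd.sh -- see joatmon08/chrome-remote-desktop-pipeline
# for the real script. Variable names and the host IP are placeholders.
set -euo pipefail

CRD_HOST="${CRD_HOST:-203.0.113.10}"            # instance IP from Terraform output
CRD_USER="${TF_VAR_crd_user:-demo}"             # must match the Packer build user
CRD_CODE="${TF_VAR_crd_code:-refresh-token}"    # token from remotedesktop.google.com/headless
CRD_PIN="${CRD_PIN:-123456}"

# Build the remote start-host invocation. $(hostname) is left unexpanded so it
# evaluates on the remote host.
remote_cmd() {
  printf '/opt/google/chrome-remote-desktop/start-host --code=%s --redirect-url=https://remotedesktop.google.com/_/oauthredirect --name=$(hostname)' "$1"
}

if [ "${DRY_RUN:-1}" = "1" ]; then
  # Dry run: show the SSH command that would be issued.
  echo "ssh ${CRD_USER}@${CRD_HOST} \"$(remote_cmd "${CRD_CODE}")\""
else
  # start-host prompts for the PIN twice, so feed it twice on stdin.
  printf '%s\n%s\n' "${CRD_PIN}" "${CRD_PIN}" |
    ssh "${CRD_USER}@${CRD_HOST}" "$(remote_cmd "${CRD_CODE}")"
fi
```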
The public key copied above is paired with a private key saved in CircleCI. I loaded the private key into CircleCI and added the `add_ssh_keys` step to ensure the SSH key is injected into the CircleCI runner.
```yaml
- add_ssh_keys:
    fingerprints:
      - "37:b4:eb:f9:24:5f:06:c2:b1:36:4c:03:a5:a1:74:aa"
```
Injecting Variables
I manage some variables in CircleCI. They tend to relate to SSH access, secrets that change frequently enough to push to Terraform Cloud, and service keys. Below are the environment variables I define as part of CircleCI settings.
```shell
GCLOUD_SERVICE_KEY="json service account key"
SSH_PUBLIC_KEY="SSH public key for host"
TFCLOUD_SERVICE_KEY="API token for Terraform Cloud"
TF_VAR_crd_code="Refresh token that you got above"
TF_VAR_region="Region you want to deploy the instance to"
```
Since CircleCI’s environment variables don’t quite constitute secrets management, I wanted to ensure that specific secrets were stored in Terraform Cloud, which is backed by Vault. The only two variables I define in the pipeline to be pushed up to Terraform Cloud are the refresh token, `TF_VAR_crd_code`, and the region, `TF_VAR_region`. In the future, I would like to use Vault or some other secrets manager to inject the secrets into the host.
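Pushing a variable into Terraform Cloud can be done through its workspace variables API. This sketch builds the JSON payload and shows the POST as a comment — the workspace ID, the `payload` helper, and the example values are all hypothetical:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: push a sensitive variable into a Terraform Cloud
# workspace via the v2 API. Workspace ID and values are placeholders.
set -euo pipefail

# Build the JSON payload for one Terraform variable, marked sensitive so
# Terraform Cloud redacts it in the UI and API responses.
payload() {
  cat <<EOF
{"data":{"type":"vars","attributes":{"key":"$1","value":"$2","category":"terraform","sensitive":true}}}
EOF
}

# Example: the refresh token as a sensitive workspace variable.
payload crd_code "example-refresh-token"

# The actual POST (not run here):
#   curl -sS -X POST "https://app.terraform.io/api/v2/workspaces/${WORKSPACE_ID}/vars" \
#     -H "Authorization: Bearer ${TFCLOUD_SERVICE_KEY}" \
#     -H "Content-Type: application/vnd.api+json" \
#     -d "$(payload crd_code "${TF_VAR_crd_code}")"
```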
The Result
After the pipeline completes, I can see the remote desktop in the Chrome Remote Desktop application.
Once I log in with my PIN, I can use the desktop as planned. One thing I needed to do after logging in was disable the screensaver. While I attempted to disable it from the command line, that configuration unfortunately only works through the desktop environment. For instructions on how to disable the screensaver, check out the guide from Google.
Destroy the Machine
Since I only use the machine a week at a time and when I travel, I don’t want to keep a long-running instance. This allows me to minimize the cost and work in a continuous fashion, committing code in small increments and pushing often. I manage this in two ways.
First, I added a `destroy` step to the pipeline that removes the instance when I manually approve it.
Second, to ensure I don’t get charged for an instance I forget about, I have a simple job in CircleCI that triggers at the beginning of the week to stop the instance. If I have some work left on it, I can restart it and push it to my repository before I destroy it. In the meantime, I don’t get charged for the instance’s hours.
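Both mechanisms can be sketched in CircleCI config — the job names and the cron schedule here are assumptions; `hold-destroy` gates the teardown behind manual approval, and the scheduled workflow stops the instance at the start of each week:

```yaml
# Hypothetical sketch of the CircleCI workflows; job names and schedule
# are assumptions.
workflows:
  version: 2
  deploy:
    jobs:
      - apply
      - hold-destroy:          # manual approval gate before teardown
          type: approval
          requires:
            - apply
      - destroy:               # runs terraform destroy once approved
          requires:
            - hold-destroy
  weekly-stop:
    triggers:
      - schedule:
          cron: "0 6 * * 1"    # Monday morning: stop the instance
          filters:
            branches:
              only: master
    jobs:
      - stop-instance
```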
Summary
If you have an interest in building your own remote desktop, continuously delivered, you can customize the Packer configuration with scripts and installations of your choice and update the CircleCI pipelines with the variables you want. I don’t intend to make this a CLI tool. The primary intent was to apply my knowledge of Continuous Delivery and its patterns to delivering a remote desktop experience that improved my own personal productivity.
Many people approached me after the talk with their own versions, their solutions ranging from running Ansible playbooks on cloud instances to Docker containers with vim plug-ins. Some even recommended the in-browser Visual Studio Code! At the end of the day, we all work on the bespoke development environment that makes us most productive. If you find yourself frustrated with repeated installation of packages for development or juggling multiple devices, there are a lot of tools to make it easier to automate your development environment and be even more productive.
References
- Setting up Chrome Remote Desktop on Compute Engine
- Image Pipeline at joatmon08/chrome-remote-desktop-image
- Desktop Deployment Pipeline at joatmon08/chrome-remote-desktop-pipeline
- Remote Desktop, Continuously Delivered at DevOpsDays Philly 2019