Using gcloud in a Docker container
Trying to prevent installing dozens of things on my computer
I just got a fresh install of macOS Sierra on my good old MacBook Pro from 2011 (time flies). The reason is that since Snow Leopard I had just kept updating the system in place, and a developer's machine becomes a mess.
Long story short: I want to keep the minimum amount of stuff on my system, and gcloud is one thing I want to keep out. No more databases, clients, libs, Homebrew, languages, etc. on my machine. I'll try, for as long as I can, to use only Docker to handle dependencies.
Does the gcloud CLI cause any trouble on the system? Not that I'm aware of, but I really want to try this experiment of keeping as many items as I can off my system. Items like Sublime Text, Kaleidoscope, Evernote, etc. will of course stay.
Google publishes an image of its CLI on Docker Hub that is always kept up to date. The starting point is to get this image:
docker pull google/cloud-sdk
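Before wiring up any credentials, a throwaway container can serve as a sanity check that the image runs; this is just a sketch and requires a working Docker daemon:

```shell
# Run gcloud from the freshly pulled image in a disposable container;
# --rm discards the container once the command exits.
docker run --rm google/cloud-sdk gcloud version
```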
We're now going to create a volume that will hold all our credentials. It is basically a named container that will act as the keeper of our credentials. With that, no sensitive information is stored on our local filesystem; but if you drop that container, you'll have to do the following steps again.
One handy thing about this is that you can keep separate setups when you have to deal with multiple accounts. Of course, gcloud also offers a handy way to handle that, but IMHO it is best to keep things separated. More about that later.
To get our credentials set on our working container we must run:
docker run -ti --name gcloud-config google/cloud-sdk gcloud init
Notice that gcloud-config is the name of your container and it will be the name of your configuration. Thus, we could have instead something like gcloud-config-personal, gcloud-config-company-a, gcloud-config-other-project and so on, making it easy to maintain multiple setups to choose from.
Welcome! This command will take you through the configuration of gcloud.
Your current configuration has been set to: [default]
Network diagnostic detects and fixes local network connection issues.
Checking network connection...done.
Reachability Check passed.
Network diagnostic (1/1 checks) passed.
You must log in to continue. Would you like to log in (Y/n)? y
Go to the following link in your browser:
https://accounts.google.com/o/oauth2/auth?redirect_uri=urn%3Aietf%3Awg%3Aoauth%3A2.0%3Aoob&prompt=select_account&response_type=code&client_id=32555940559.apps.googleusercontent.com&scope=https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fuserinfo.email+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcloud-platform+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fappengine.admin+https%3A%2F%2Fwww.googleapis.com%2Fauth%2Fcompute&access_type=offline
Enter verification code: _
After login, that URL will bring you to a page with a code that must be entered in the container in order to continue the authentication process.
After entering the code, the container will follow its setup steps, asking you to select the project you want to use as default, the default region, the default zone, and so on.
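If you later need a second, independent setup, the same pattern applies; the container name below is just an illustration of the naming scheme mentioned earlier:

```shell
# A separate credentials container for a hypothetical second account;
# each `gcloud init` run is stored in its own named container.
docker run -ti --name gcloud-config-company-a google/cloud-sdk gcloud init
```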
Let's now use our gcloud authenticated container to interact with our resources at Google Cloud.
docker run --rm -ti --volumes-from gcloud-config \
google/cloud-sdk \
gcloud info
The --volumes-from flag mounts the filesystem created by the gcloud init command we ran earlier. With that, any further disposable container we launch with --rm will use the credentials stored in the gcloud-config container we've generated.
Having many containers makes it easier to handle multiple environments in a safer way.
docker run --volumes-from gcloud-config-personal ...
docker run --volumes-from gcloud-config-company-a ...
docker run --volumes-from gcloud-config-other-project ...
It's a start. Let's see how the experience of using gcloud inside a Docker container turns out. The first thing I had to deal with was the long command line needed to run it. I solved it with a solution as old as Unix itself: alias.
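A minimal sketch of what that alias could look like, assuming the gcloud-config container created above (use whatever container name you chose during gcloud init):

```shell
# Wrap the long docker invocation so plain `gcloud` works as usual;
# "gcloud-config" is the credentials container created earlier.
alias gcloud='docker run --rm -ti --volumes-from gcloud-config google/cloud-sdk gcloud'

# After this, e.g. `gcloud info` runs inside a disposable container.
```

You could drop this into your shell profile, or define one alias per configuration container (one for gcloud-config-personal, another for gcloud-config-company-a, and so on).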