Using a single Docker repository with multiple GKE projects

Alexey Timanovskiy
Feb 10, 2017 · 3 min read

In all Google Container Engine (GKE) tutorials you see a Docker repository belonging to the project: gcr.io/&lt;your-project-name&gt;/. The nice thing about it is that it works out of the box and requires no setup (other than enabling GKE itself, of course). But what do you do if you have multiple GKE projects that use the same Docker images? For example, testing, staging, and production environments all consuming Docker images built by a CI system. Having to push to multiple repositories, and having to specify different repositories in Kubernetes YAML files, is cumbersome. What we would like is a single repository from which we can deploy to multiple projects.

Having a single repository is easy: just create another dedicated Google Cloud project, say &lt;mycompany-docker&gt;, and you can tag your Docker images with gcr.io/&lt;mycompany-docker&gt;/ and push them. However, if you try to reference such an image from another project, the deployment will fail with a Kubernetes error saying it cannot pull the image. This is to be expected: GKE repositories are access-protected and available only from within the same project. What we want is to give our other projects read access to this Docker repository. It turns out Kubernetes deployments pull images under a special service account called the "Compute Engine default service account", so that is the account we need to grant read permissions to.
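Building and pushing to the central repository then looks something like the following sketch. The project and image names are hypothetical; substitute your own. (As of early 2017, pushing to gcr.io goes through the gcloud credential helper.)

```shell
# Hypothetical names for illustration only.
DOCKER_PROJECT="mycompany-docker"
IMAGE="myapp"
TAG="v1.2.3"

# Build and tag the image against the central repository...
docker build -t "gcr.io/${DOCKER_PROJECT}/${IMAGE}:${TAG}" .

# ...and push it, authenticating through gcloud.
gcloud docker -- push "gcr.io/${DOCKER_PROJECT}/${IMAGE}:${TAG}"
```

Your CI system now pushes to one place, regardless of how many environments will consume the image.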

Step 1. Get the service account ID.
Go to your GKE project (the one that uses the Docker images) in the Cloud Console, open the "IAM & Admin" tab, and find the "Compute Engine default service account" among the service accounts. Write down the account ID, which looks like an email address. Repeat this for every project that will need access to the Docker repository.
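If you prefer the command line over the console, the same ID can be looked up with gcloud. The project ID below is a hypothetical placeholder; the account email follows the well-known pattern &lt;project-number&gt;-compute@developer.gserviceaccount.com.

```shell
# Hypothetical project ID; substitute your own.
GKE_PROJECT="mycompany-staging"

# List the default compute service account's email directly...
gcloud iam service-accounts list \
  --project "${GKE_PROJECT}" \
  --filter "displayName:'Compute Engine default service account'" \
  --format "value(email)"

# ...or construct it from the project number.
PROJECT_NUMBER=$(gcloud projects describe "${GKE_PROJECT}" \
  --format "value(projectNumber)")
echo "${PROJECT_NUMBER}-compute@developer.gserviceaccount.com"
```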

Step 2. Add the accounts to the access list.
Switch to your Docker repository project, go to the "IAM & Admin" tab, select the "IAM" tab on the left, and click the "+ Add" button at the top.

Give the "Storage Object Viewer" role to all the service accounts you recorded in step 1: copy each service account ID into the "Member" field and select the role from the roles menu.
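The same grant can be scripted instead of clicked through. A minimal sketch, assuming the hypothetical project name and a service account email recorded in step 1:

```shell
# Hypothetical values; substitute the ones from step 1.
DOCKER_PROJECT="mycompany-docker"
SA_EMAIL="123456789012-compute@developer.gserviceaccount.com"

# Grant read access to the docker project's image storage.
gcloud projects add-iam-policy-binding "${DOCKER_PROJECT}" \
  --member "serviceAccount:${SA_EMAIL}" \
  --role "roles/storage.objectViewer"
```

Run it once per consuming project's service account. Scripting this also makes the setup repeatable when you add a new environment later.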

That's it. You can now refer to your central Docker repository in your Kubernetes YAML config files and access it from your staging and production environments, while keeping the repository private.
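A deployment in any of the consuming projects can then reference the image by its central path. A minimal sketch with hypothetical names (the Deployment API group shown is the one current in early 2017):

```shell
# Deploy to whichever cluster kubectl is currently pointed at;
# the image path is the same in staging and production.
kubectl apply -f - <<EOF
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 2
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
      - name: myapp
        image: gcr.io/mycompany-docker/myapp:v1.2.3
EOF
```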

Google Cloud Platform - Community

A collection of technical articles published or curated by Google Cloud Platform Developer Advocates. The views expressed are those of the authors and don't necessarily reflect those of Google.

