Stackdriver Error Reporting: part 1

Further adventures w/ Golang & Kubernetes Engine

Daz Wilkin
Google Cloud - Community
6 min read · Mar 8, 2018


My colleague suggested I write a blog on Stackdriver Error Reporting. He teased me with the Golang Library (yup!) and I decided to deploy to Kubernetes Engine (woop!).

So… Stackdriver Error Reporting three-ways: local, Docker and Kubernetes Engine.

Setup

I assume you’ve a Kubernetes Engine ‘baby’ cluster… go on, go Regional. You can run 1 node in each of 3 zones. I recommend taking the opportunity to explore Kubernetes Engine in this compelling configuration. And, for an hour or so, the costs aren’t outrageous.
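If you need to create one, something like the following should work; the cluster name and region here are placeholders, GOOGLE_PROJECT_ID is assumed to hold your project ID (see below), and older gcloud releases may need the beta track for regional clusters:

CLUSTER=[[YOUR-CLUSTER-NAME]]
gcloud container clusters create ${CLUSTER} \
--region=us-west1 \
--num-nodes=1 \
--project=${GOOGLE_PROJECT_ID}

With --num-nodes=1 and a region, you get one node per zone, i.e. three nodes in total.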

Stackdriver Error Reporting

The documentation for Go(lang) includes deploying to Kubernetes Engine: https://cloud.google.com/error-reporting/docs/setup/go#kubernetes_engine

The Google-provided sample does-what-it-says-on-the-box but let’s tweak it slightly. We really only need an error so let’s create one and then periodically trigger it so that we capture a bunch of information in Stackdriver. Here’s the code:
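A minimal sketch of that code follows; the ServiceName here is arbitrary, and the authoritative version is the gist fetched by the wget command further down:

package main

import (
    "context"
    "fmt"
    "log"
    "os"
    "time"

    "cloud.google.com/go/errorreporting"
)

func main() {
    ctx := context.Background()

    // Read the current project from the environment rather than hard-coding it
    projectID := os.Getenv("GOOGLE_PROJECT_ID")

    errorClient, err := errorreporting.NewClient(ctx, projectID, errorreporting.Config{
        ServiceName: "error-reporting-test", // assumption: any name will do
        OnError: func(err error) {
            log.Printf("Could not report error: %v", err)
        },
    })
    if err != nil {
        log.Fatal(err)
    }
    defer errorClient.Close()

    // Empty locally; set via Docker --env flags or the Kubernetes Downward API
    pod := os.Getenv("POD_NAME")
    node := os.Getenv("POD_NODE")
    ip := os.Getenv("POD_IP")

    // Every 5 seconds a goroutine creates a (static) error, reports it to
    // Stackdriver Error Reporting and logs it locally
    for {
        go func() {
            err := fmt.Errorf("[Error] pod: [%s]; node: [%s]; ip: [%s]", pod, node, ip)
            errorClient.Report(errorreporting.Entry{Error: err})
            log.Print(err)
        }()
        time.Sleep(5 * time.Second)
    }
}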

The only changes to the Google-provided sample are that the current project is read from an environment variable (GOOGLE_PROJECT_ID) and that an infinite loop triggers a goroutine every 5 seconds (feel free to revise this timing) to create an (admittedly static) error.

In order to use Stackdriver Error Reporting, we must enable the service in our project. Assuming you’re using the environment variable GOOGLE_PROJECT_ID to reflect your Google Cloud Platform Project ID, you may:

gcloud services enable clouderrorreporting.googleapis.com \
--project=${GOOGLE_PROJECT_ID}

As always, I recommend you employ Application Default Credentials (ADCs) to authenticate your code so that we may run it locally, Dockerized and on Kubernetes without change.
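If you haven’t yet established ADCs with your user account, one way to do so is:

gcloud auth application-default login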

1. Local using User credentials

To run the above code locally:

export GOOGLE_PROJECT_ID=[[YOUR-PROJECT-ID]]
DIR=[[YOUR-WORKING-DIRECTORY]]
mkdir -p ${HOME}/tmp/${DIR}/go
export GOPATH=${HOME}/tmp/${DIR}/go
export PATH=$PATH:$GOPATH/bin
go get -u cloud.google.com/go/errorreporting
wget --output-document ${HOME}/tmp/${DIR}/go/src/main.go https://gist.githubusercontent.com/DazWilkin/fb372dff70fdcfc34179ef83383f67ff/raw/6c7daf9c5c1f7385bc88940cc5135cec086bb4d9/main.go
go run ${HOME}/tmp/${DIR}/go/src/main.go

As long as you’ve authenticated using ADCs and you’ve correctly set GOOGLE_PROJECT_ID, you should see:

2018/03/07 16:23:09 [Error] pod: []; node: []; ip: []
2018/03/07 16:23:14 [Error] pod: []; node: []; ip: []
2018/03/07 16:23:19 [Error] pod: []; node: []; ip: []

If you want to be more diligent, you can assign values to the pod, node and ip with the following:

POD_NAME="N" POD_NODE="X" POD_IP="0.0.0.0" go run main.go

And you should then see:

2018/03/07 16:25:23 [Error] pod: [N]; node: [X]; ip: [0.0.0.0]
2018/03/07 16:25:28 [Error] pod: [N]; node: [X]; ip: [0.0.0.0]
2018/03/07 16:25:33 [Error] pod: [N]; node: [X]; ip: [0.0.0.0]

All good thus far!

2. Local using Service Account credentials

It’s generally preferable to use service account credentials with ADCs rather than user credentials (which is what we’re doing above). To use a service account we must first create one, generate a key for it and then use IAM to assign it appropriate permissions. All of the above may be achieved with the following:

ROBOT=[[YOUR-ROBOT-NAME]]   # gke-error-reporting
EMAIL=${GOOGLE_PROJECT_ID}.iam.gserviceaccount.com
gcloud iam service-accounts create $ROBOT \
--display-name=$ROBOT \
--project=${GOOGLE_PROJECT_ID}
gcloud iam service-accounts keys create ./key.json \
--iam-account=${ROBOT}@${EMAIL} \
--project=${GOOGLE_PROJECT_ID}
gcloud projects add-iam-policy-binding ${GOOGLE_PROJECT_ID} \
--member=serviceAccount:${ROBOT}@${EMAIL} \
--role=roles/errorreporting.user
gcloud projects add-iam-policy-binding ${GOOGLE_PROJECT_ID} \
--member=serviceAccount:${ROBOT}@${EMAIL} \
--role=roles/errorreporting.writer
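
If you’d like to confirm the account and its role bindings:

gcloud iam service-accounts list \
--project=${GOOGLE_PROJECT_ID}
gcloud projects get-iam-policy ${GOOGLE_PROJECT_ID}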

NB I’m going to assume hereafter that you either cd ${HOME}/tmp/${DIR}/go/src or you’ll just create the ensuing files in that directory for simplicity.

To use the service account instead of your user credentials, set:

export GOOGLE_APPLICATION_CREDENTIALS=./key.json

and rerun the Golang:

go run main.go

That’s it!

3. Docker using Service Account Credentials

Here’s a Dockerfile:
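A minimal sketch, assuming the static binary is named main and that ca-certificates.crt sits alongside it (both steps are covered below):

FROM scratch

# CA certificates so the Error Reporting client can call Google APIs over TLS
ADD ca-certificates.crt /etc/ssl/certs/

# The statically-linked binary produced by the go build command below
ADD main /

ENTRYPOINT ["/main"]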

One (of many) awesome features of Golang is that the tooling makes it trivial to generate succinct, static binaries (and to cross-compile for your Raspberry Pis, Omega2s etc. — more later). To save a few hundred MBs, I like to build a static binary and use the SCRATCH image. To use this, you need to do something like the following for amd64:

CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo -o main .

You also need to copy ca-certificates.crt to your working directory; this file is usually found in /etc/ssl/certs, so:

cp /etc/ssl/certs/ca-certificates.crt ${HOME}/tmp/${DIR}/go/src

You may now build the Docker image and let’s use my favorite Google Container Registry (GCR) to host it for us (close to Kubernetes Engine):

CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo -o main .
docker build --tag=gcr.io/${GOOGLE_PROJECT_ID}/go-errrep .
gcloud docker -- push gcr.io/${GOOGLE_PROJECT_ID}/go-errrep

Alright, we can now run the sample using Docker and using the Service Account. In the command below, I’m spoofing a value of POD_NAME to confirm that the error includes this:

docker run \
--env=GOOGLE_APPLICATION_CREDENTIALS=/key.json \
--env=GOOGLE_PROJECT_ID=${GOOGLE_PROJECT_ID} \
--env=POD_NAME=hoops \
--volume=$PWD/key.json:/key.json \
gcr.io/${GOOGLE_PROJECT_ID}/go-errrep

and, as before, you should expect to see something akin to:

2018/03/08 01:01:49 [Error] pod: [hoops]; node: []; ip: []
2018/03/08 01:01:54 [Error] pod: [hoops]; node: []; ip: []
2018/03/08 01:01:59 [Error] pod: [hoops]; node: []; ip: []

4. King Kubernetes

Alright… we’ve a demonstrably working Docker image. Hopefully, you have a cluster purring in the cloud ready for some work. We’ve a Service Account with appropriate permissions and, you’ll recall, we enabled the Stackdriver Error Reporting service. Let’s deploy to Kubernetes.

There’s an elegant solution to make Service Accounts accessible to Pods running on Kubernetes. We must first upload a key for the Service Account to the Kubernetes cluster as a secret:

kubectl create secret generic error-reporting-key \
--from-file=key.json=${HOME}/tmp/${DIR}/go/src/key.json

Then, for our Deployment, we will map this key from the secret into the Pod as a file. This is analogous to the --volume=$PWD/key.json:/key.json step that we performed when we ran the solution locally under Docker.

Here’s a Deployment file for you to use:
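A minimal sketch follows; the apiVersion, replica count and app=error-reporting label are assumptions you may adjust:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: error-reporting
spec:
  replicas: 1
  selector:
    matchLabels:
      app: error-reporting
  template:
    metadata:
      labels:
        app: error-reporting
    spec:
      volumes:
      - name: google-cloud-key
        secret:
          secretName: error-reporting-key
      containers:
      - name: error-reporting
        image: gcr.io/${GOOGLE_PROJECT_ID}/go-errrep
        volumeMounts:
        - name: google-cloud-key
          mountPath: /var/secrets/google
        env:
        - name: GOOGLE_APPLICATION_CREDENTIALS
          value: /var/secrets/google/key.json
        - name: GOOGLE_PROJECT_ID
          value: ${GOOGLE_PROJECT_ID}
        - name: POD_NAME
          valueFrom:
            fieldRef:
              fieldPath: metadata.name
        - name: POD_NODE
          valueFrom:
            fieldRef:
              fieldPath: spec.nodeName
        - name: POD_IP
          valueFrom:
            fieldRef:
              fieldPath: status.podIP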

NB Please replace the two occurrences of ${GOOGLE_PROJECT_ID} (in the container image name and in the GOOGLE_PROJECT_ID environment variable) with the actual value of your Google Cloud Platform Project ID.

The volumes section creates a volume called google-cloud-key from the Kubernetes Secret called error-reporting-key that holds the key.json file. This volume is then mounted into the Pod under /var/secrets/google. The file is then referenced explicitly through an environment variable called (as you’d expect) GOOGLE_APPLICATION_CREDENTIALS. You will recall this is the environment variable we used when we referenced the key when using Docker locally.

That’s quite a dance but it’s a powerful solution to the problem of providing Service Account keys to Kubernetes Pods securely.

While we’re down here, the POD_NAME, POD_NODE and POD_IP entries utilize Kubernetes’ Downward API to provide containers in our Pods with details about their configuration. You may feel this is a roundabout way to achieve this, but it represents a best practice that, as you saw when we ran the container locally, permits us to keep the container decoupled from Kubernetes.

Let’s deploy this to your cluster:

kubectl apply --filename=deployment.yaml
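
If you prefer the command line, you can confirm the Deployment and its Pods (the app=error-reporting label matches the sketch above):

kubectl get deployment error-reporting
kubectl get pods --selector=app=error-reporting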

And, if you’re using the Kubernetes UI:

Deployment “error-reporting”

And, if you pick the pod and check its logs:

Pod “error-reporting” logs

Google Cloud Console provides an increasingly comparable UI for Kubernetes Engine:

Cloud Console “error-reporting”

But, you may be asking… so what? What did all this Stackdriver Error Reporting goodness really provide?

Return to Stackdriver Error Reporting

We get this:

https://console.cloud.google.com/errors?project=${GOOGLE_PROJECT_ID}

Stackdriver Error Reporting

And this:

And, opening one of the samples:

And we may be tempted to click that link to main.go in the expanded sample and find:

Conclusion

Stackdriver Error Reporting is a powerful service. Hopefully this post has helped you understand how you may access Error Reporting from Golang applications. To demonstrate this, we created a trivial Golang sample that repeatedly generates errors. We deployed this sample (1) locally; (2) locally under Docker; (3) using Kubernetes Engine.

For completeness, a bonus 4th way is that it’s possible to reference your User credentials (instead of a Service Account) when running under Docker locally:

docker run \
--env=GOOGLE_PROJECT_ID=${GOOGLE_PROJECT_ID} \
--volume=${HOME}/.config/gcloud:/.config/gcloud \
gcr.io/${GOOGLE_PROJECT_ID}/go-errrep

Feedback always welcome!

That’s it.
