Particle, OpenCensus & MicroK8s

Daz Wilkin
Google Cloud - Community
Jun 26, 2019 · 7 min read

I set up a moisture sensor for a bonsai using Nick Engmann’s “Learn How to Build In-Plants, a Mesh-Connected Soil Monitoring System”. The solution publishes Particle device data to ThingSpeak.

I was thinking of extending the solution to publish to one of Google’s database services, but there’s something of a fine line between application data (which this is) and measurement data (which this also is). For measurements, we have monitoring solutions, and a monitoring framework that interests me is OpenCensus.

So, herewith: Particle device event data is webhooked to a Google Cloud Function that creates an OpenCensus measurement from the data and ships it to the OpenCensus Agent running on MicroK8s (‘cos it’s cool), which in turn exposes it as a Prometheus metrics endpoint (also cool) and exports it to Stackdriver (‘cos I try to be a good Googler).

Follow Nick’s post to get yourself to a point where you have a particle.io device that’s periodically measuring and publishing moisture data. If the device is publishing data to ThingSpeak, you’re good:

OpenCensus

OpenCensus has client SDKs that are monitoring-service agnostic. These ship data to an OpenCensus Agent, and the Agent is configured to convert and forward the data to your preferred monitoring (and tracing) services. I’m focusing on monitoring and using Stackdriver and Prometheus. Here’s the Agent configuration:

receivers:
  opencensus:
    address: ":55678"
exporters:
  stackdriver:
    project: "[[YOUR-PROJECT]]"
    enable_tracing: false
    enable_metrics: true
  prometheus:
    address: ":8999"
zpages:
  port: 8888

You may run the Agent locally for testing but will need a Google service account with roles/monitoring.metricWriter if you wish to export data to Stackdriver. Here’s what I did:

PROJECT=[[YOUR-PROJECT]]
ROBOT=opencensus
FILE=${PWD}/${ROBOT}.json
gcloud iam service-accounts create $ROBOT \
--display-name=$ROBOT \
--project=$PROJECT
gcloud iam service-accounts keys create ${FILE} \
--iam-account=${ROBOT}@${PROJECT}.iam.gserviceaccount.com \
--project=$PROJECT
gcloud projects add-iam-policy-binding $PROJECT \
--member=serviceAccount:${ROBOT}@${PROJECT}.iam.gserviceaccount.com \
--role=roles/monitoring.metricWriter

And then:

docker run \
--interactive --tty \
--volume=$PWD/ocagent.yaml:/configs/ocagent.yaml \
--volume=$PWD/opencensus.json:/secrets/opencensus.json \
--publish=8888:8888 \
--publish=8999:8999 \
--publish=55678:55678 \
--env=GOOGLE_APPLICATION_CREDENTIALS=/secrets/opencensus.json \
omnition/opencensus-agent:0.1.8 \
--config=/configs/ocagent.yaml

While we’re at it, here’s how this configuration will be mapped to a Kubernetes configuration file:

---
apiVersion: v1
kind: ConfigMap
metadata:
  name: ocagent
data:
  ocagent.yaml: |
    receivers:
      opencensus:
        address: ":55678"
    exporters:
      stackdriver:
        project: "[[YOUR-PROJECT]]"
        enable_tracing: false
        enable_metrics: true
      prometheus:
        address: ":8999"
    zpages:
      port: 8888
...
---
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    project: particle.io
  name: ocagent
spec:
  replicas: 1
  selector:
    matchLabels:
      project: particle.io
      app: agent
  template:
    metadata:
      labels:
        project: particle.io
        app: agent
    spec:
      volumes:
      - name: ocagent
        configMap:
          name: ocagent
      - name: opencensus-key
        secret:
          secretName: opencensus-key
      containers:
      - name: ocagent
        image: omnition/opencensus-agent:0.1.8
        imagePullPolicy: IfNotPresent
        args:
        - --config=/configs/ocagent.yaml
        volumeMounts:
        - name: ocagent
          mountPath: /configs/
        - name: opencensus-key
          mountPath: /var/secrets/google
        env:
        - name: GOOGLE_APPLICATION_CREDENTIALS
          value: /var/secrets/google/opencensus.json
...
---
apiVersion: v1
kind: Service
metadata:
  labels:
    project: particle.io
  name: ocagent
spec:
  selector:
    project: particle.io
    app: agent
  ports:
  - name: zpages
    port: 8888
  - name: prometheus
    port: 8999
  - name: grpc
    port: 55678
  type: NodePort
...

Cloud Functions

We need a way to receive the Webhooks from Particle and transform them into OpenCensus measurements. I’m using Google Cloud Functions and Go. Here’s the function:
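A minimal sketch of such a function, assuming the measure is named “percent” (to match the OpenCensus/percent metric seen later), the payload fields follow the test script further below, and the Agent address and service name come from the AGENT_ENDPOINT and SERVICE_NAME environment variables:

// Package p contains a sketch of the webhook Cloud Function.
package p

import (
    "context"
    "encoding/json"
    "log"
    "net/http"
    "os"
    "strconv"

    "contrib.go.opencensus.io/exporter/ocagent"
    "go.opencensus.io/stats"
    "go.opencensus.io/stats/view"
)

// Event mirrors the JSON published by the Particle Webhook.
type Event struct {
    Event       string `json:"event"`
    Data        string `json:"data"`
    PublishedAt string `json:"published_at"`
    CoreID      string `json:"coreid"`
}

// percent is the (assumed) soil-moisture measure.
var percent = stats.Float64("percent", "Soil moisture (percent)", stats.UnitDimensionless)

func init() {
    // Ship views to the OpenCensus Agent identified by AGENT_ENDPOINT.
    exporter, err := ocagent.NewExporter(
        ocagent.WithInsecure(),
        ocagent.WithAddress(os.Getenv("AGENT_ENDPOINT")),
        ocagent.WithServiceName(os.Getenv("SERVICE_NAME")),
    )
    if err != nil {
        log.Fatal(err)
    }
    view.RegisterExporter(exporter)

    // Export the most recent value of the measure.
    if err := view.Register(&view.View{
        Name:        "percent",
        Description: "Soil moisture (percent)",
        Measure:     percent,
        Aggregation: view.LastValue(),
    }); err != nil {
        log.Fatal(err)
    }
}

// Percent handles the Particle Webhook and records an OpenCensus measurement.
func Percent(w http.ResponseWriter, r *http.Request) {
    var event Event
    if err := json.NewDecoder(r.Body).Decode(&event); err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    value, err := strconv.ParseFloat(event.Data, 64)
    if err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    log.Printf("value: %f", value)
    stats.Record(context.Background(), percent.M(value))
    w.WriteHeader(http.StatusOK)
}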

And the accompanying go.mod:

module cloudfunction

go 1.12

require (
    contrib.go.opencensus.io/exporter/ocagent v0.5.0
    go.opencensus.io v0.21.0
)

For local development, I recommend placing both of these in a subdirectory; for consistency, let’s call it “p”. In a sibling subdirectory called “server”, place main.go:

package main

import (
    "log"
    "net/http"

    "github.com/[[YOUR-ACCOUNT]]/webhook/p"
)

func main() {
    log.Println("main")
    http.HandleFunc("/", p.Percent)
    log.Fatal(http.ListenAndServe(":9999", nil))
}

This just provides a convenience wrapper so that we may test our function locally. Once the OpenCensus Agent is running, start the server:

SERVICE_NAME=test \
AGENT_ENDPOINT=:55678 \
go run server/main.go

You can then throw some test data at the service:

#!/bin/bash
ENDPOINT="localhost:9999"
for TEST in {1..100}
do
  PERCENT=$(bc <<< "scale=2; ${TEST}/100")
  DATE=$(date --rfc-3339=seconds)
  DATA="\
{\"event\":\"test\",\
\"data\":\"${PERCENT}\",\
\"published_at\":\"${DATE}\",\
\"coreid\":\"TEST\"}"
  RESULT=$(\
    curl \
    --silent \
    --request POST \
    --header "Content-Type:application/json" \
    --data "${DATA}" \
    ${ENDPOINT})
  printf "%s [%s]\n" ${PERCENT} ${RESULT}
done

Once some data is generated, you can view zPages:

http://localhost:8888/debug/rpcz

Prometheus:

http://localhost:8999/metrics

And, Stackdriver. Use “Metrics Explorer” and you should be able to browse “OpenCensus/percent”:

Kubernetes

It’s overkill to use Kubernetes to host the OpenCensus Agent, but Kubernetes provides an elegant solution to the problem of sharing files (e.g. ocagent.yaml) with containers. For this reason, I’m using MicroK8s running on a Compute Engine instance.

We want the cluster to be able to authenticate to Stackdriver. To do this, we will copy the service account key to the cluster instance, create a Kubernetes secret from it and then make the secret available (as a file in a volume mount) to the deployment. From your workstation:

gcloud compute scp \
${PWD}/opencensus.json \
${INSTANCE}: \
--project=${PROJECT} \
--zone=${ZONE}

Within ${INSTANCE}:

microk8s.kubectl create secret generic opencensus-key \
--from-file=opencensus.json=${PWD}/opencensus.json
secret/opencensus-key created

NB For simplicity I’m retaining the key’s filename (opencensus.json) within the secret. But, you can rename the file using this command too.
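For example, if you’d prefer the key to appear inside the Pod as key.json (an illustrative name), you could create the secret like this, remembering to change GOOGLE_APPLICATION_CREDENTIALS in the Deployment to match:

microk8s.kubectl create secret generic opencensus-key \
--from-file=key.json=${PWD}/opencensus.json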

Then:

microk8s.kubectl apply --filename=kubernetes.yaml

We’ll need the NodePort of the gRPC endpoint of the OpenCensus Agent to configure the Cloud Function:

GRPC=$(\
microk8s.kubectl get services/ocagent \
--output=jsonpath='{.spec.ports[?(@.name=="grpc")].nodePort}') && \
echo ${GRPC}

NB If you wish, you may also determine the NodePorts for the zPages (replace GRPC with ZPGS and name=="zpages") and Prometheus (replace GRPC with PROM and name=="prometheus") endpoints, as shown below.
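Following the same pattern as the GRPC command:

ZPGS=$(\
microk8s.kubectl get services/ocagent \
--output=jsonpath='{.spec.ports[?(@.name=="zpages")].nodePort}') && \
echo ${ZPGS}

PROM=$(\
microk8s.kubectl get services/ocagent \
--output=jsonpath='{.spec.ports[?(@.name=="prometheus")].nodePort}') && \
echo ${PROM}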

When you created the MicroK8s VM, you should have noted its instance name (${INSTANCE}). We can use this, along with ${PROJECT} and ${ZONE}, to get the instance’s public IP address:

MICROK8S=$(\
gcloud compute instances describe ${INSTANCE} \
--project=${PROJECT} \
--zone=${ZONE} \
--format="value(networkInterfaces.accessConfigs[0].natIP)") && \
echo ${MICROK8S}

You’ll need to punch a hole through the firewall so the Cloud Function can access the service:

gcloud compute firewall-rules create particle-demo \
--direction=INGRESS \
--priority=1000 \
--network=default \
--action=ALLOW \
--rules=tcp:${GRPC},tcp:${ZPGS},tcp:${PROM} \
--source-ranges=0.0.0.0/0 \
--project=${PROJECT}

NB Only include the zPage and Prometheus ports if you want to use them over the external IP.

Cloud Functions

Once you’re happy with the local testing, you can deploy to Cloud Functions:

PROJECT=[[YOUR-PROJECT]]
REGION=[[YOUR-REGION]] # Perhaps 'us-central1'
gcloud services enable functions.googleapis.com \
--project=${PROJECT}
gcloud functions deploy webhook \
--region=${REGION} \
--project=${PROJECT} \
--runtime=go112 \
--trigger-http \
--entry-point=Percent \
--set-env-vars=\
SERVICE_NAME=particle.io,\
AGENT_ENDPOINT=${MICROK8S}:${GRPC}

Once deployed, the Cloud Function can best be tested as before using the bash script, though it’s just as quick to add a new Particle Webhook.
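If you do re-run the script, one way (assuming the default gen-1 HTTP trigger) is to point its ENDPOINT at the deployed function’s trigger URL:

ENDPOINT=$(\
gcloud functions describe webhook \
--region=${REGION} \
--project=${PROJECT} \
--format="value(httpsTrigger.url)") && \
echo ${ENDPOINT}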

Webhook

Using the Particle Console, create a Webhook integration that POSTs the device’s events (as JSON) to the Cloud Function’s trigger URL:

After a few minutes, the configured Webhook will report — hopefully — the successful receipt of events, see below.

Testing

Check that the Particle device is publishing events. You may do this using the Particle Console:

Then check that the Webhook is shipping the events to the Cloud Function:

Check the Cloud Function Console or logs:

gcloud logging read 'resource.type="cloud_function"' \
--project=${PROJECT} \
--format=json --limit=25 \
| jq -r .[].textPayload
Function execution took 3 ms, finished with status code: 200
value: 20.122101
percent
Function execution started
Function execution took 4 ms, finished with status code: 200
value: 20.415140
percent
Function execution started
Function execution took 3 ms, finished with status code: 200
value: 20.317461
percent
Function execution started
Function execution took 4 ms, finished with status code: 200
value: 20.561661
percent
Function execution started
Function execution took 8 ms, finished with status code: 200
value: 20.366301
percent
Function execution started
Function execution took 4 ms, finished with status code: 200
value: 20.317461
percent
Function execution started
Function execution took 3 ms, finished with status code: 200

Confirm that the MicroK8s service (pod) is listening:

POD=$(\
microk8s.kubectl get pods \
--selector=project=particle.io \
--output=jsonpath="{.items[0].metadata.name}")
microk8s.kubectl logs pod/${POD}
Setting Stackdriver default location to "[[YOUR-PROJECT]]"
{..."msg":"Metrics Exporter enabled","exporter":"stackdriver"}
{..."msg":"Metrics Exporter enabled","exporter":"prometheus"}
Running OpenCensus Trace and Metrics receivers as a gRPC service at ":55678"
Running zPages on port 8888

If you expose the ${ZPGS} and ${PROM} ports, you can view both of these to confirm that data is being pushed through the Agent:

google-chrome ${MICROK8S}:${ZPGS}/debug/rpcz
google-chrome ${MICROK8S}:${PROM}/metrics

I’ll leave you to configure a Prometheus server to target the OpenCensus Agent’s Prometheus endpoint.
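If you do, a minimal prometheus.yml scrape config might look like the following; the target is illustrative (substitute the values of ${MICROK8S} and ${PROM}):

scrape_configs:
  - job_name: opencensus-agent
    scrape_interval: 15s
    static_configs:
      - targets: ["[[MICROK8S]]:[[PROM]]"]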

Here’s a Stackdriver chart:

Conclusion

In this story, we’ve extended Nick’s solution by creating a Webhook service using Google Cloud Functions that creates OpenCensus measurements, ships these to an OpenCensus Agent deployed on MicroK8s and provides the measurements to both Stackdriver and Prometheus.

One of OpenCensus’ strengths is that it separates the concerns of the producers from the consumers. Now that the Particle data is being exposed as OpenCensus data, it’s a trivial matter of reconfiguring the Agent to expose the data to Datadog, AWS, Azure and any of the other supported monitoring services.

That’s all!

Tidy Up

The scorched-earth approach will delete *everything*, including the project…

Proceed with caution…

gcloud projects delete ${PROJECT}

Alternatively, you may choose some subset of the following to delete specific resources:

gcloud functions delete webhook \
--region=${REGION} \
--project=${PROJECT}
gcloud compute instances delete ${INSTANCE} \
--zone=${ZONE} \
--project=${PROJECT}
gcloud compute firewall-rules delete particle-demo \
--project=${PROJECT}
