Cloud Functions Prometheus Exporter

Daz Wilkin
Google Cloud - Community
7 min read · Jan 4, 2018

Over the weekend, I booted my Photons, and recalled an old post in which I documented the Photons using a Telegraf Webhook endpoint to capture data displayed using Grafana.

This time, I thought it’d be interesting (though I’m not entirely sure it’s a valid|good use-case) to use Google Cloud Functions as a Prometheus Exporter that periodically interrogates the Photons and uses their variables as metrics.

I’m going to tease the remainder of the post now with some eye-candy and will flesh it out tomorrow. I would value someone’s review of the JavaScript, as the Promises code became gnarly and I sense there’s a much better way to refactor it.

Photo w/ sensor

Here’s a photo of one of the Photons with an Adafruit Si7021 temperature-humidity sensor wired in:

Here’s the code for the Photon; it exposes the Si7021 readings as the Particle cloud variables “t” (temperature) and “h” (humidity):

Grafana

Grafana runs in a container and auto-magically queries the Prometheus data source. The upper graph reports Temperature and the lower graph reports Humidity. I have 3 Photons; each has a temperature-humidity sensor and reports its readings as “t” and “h” respectively.

Run Grafana with:

docker run \
--detach \
--publish=3000:3000 \
grafana/grafana

And then access it via http://localhost:3000. The default username/password is admin/admin. Once (!) Prometheus is running, click “Add data source” and complete the form as below:

Grafana configured for Prometheus
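
If the form is hard to make out, the values amount to something like this (the data source name is your choice, and the “direct” access mode is my assumption; it lets the browser reach Prometheus on the host’s localhost):

Name:   Prometheus
Type:   Prometheus
URL:    http://localhost:9090
Access: direct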

When you click “Add”, you should see:

Success!

Click “Dashboards” and then click “Import” alongside “Prometheus Stats”.

Then return to “Home”, select “Prometheus Stats”, and you should see:

Grafana w/ “Prometheus Stats” dashboard

We’ll return to Grafana later to display the Particle (Photon) metrics.

Prometheus

Prometheus runs in a container, configured to monitor itself and the Cloud Function. Here the metrics are filtered to particle_gauge and only the “t” (temperature) values are reported.

Prometheus: Graph
Prometheus: Targets

Here’s the Prometheus configuration. It monitors itself on localhost, and it scrapes the Cloud Function at the generic Cloud Functions endpoint; no metrics_path needs to be specified because Prometheus assumes /metrics by convention, which happens to match the function’s name.
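
A minimal sketch of that configuration, assuming a 15-second scrape interval and job names of my choosing:

global:
  scrape_interval: 15s

scrape_configs:
  # Prometheus scraping itself
  - job_name: prometheus
    static_configs:
      - targets: ['localhost:9090']

  # The Cloud Function; metrics_path defaults to /metrics,
  # which matches the deployed function's name
  - job_name: particle
    scheme: https
    static_configs:
      - targets: ['us-central1-[[YOUR-PROJECT-ID]].cloudfunctions.net']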

Run Prometheus assuming prometheus.yml is in the current ($PWD) directory with:

docker run \
--net=host \
--publish=9090:9090 \
--volume=$PWD/prometheus.yml:/etc/prometheus/prometheus.yml \
prom/prometheus

NB “--net=host” puts the container on the host’s network, so Prometheus can refer to itself via the host’s localhost and the container’s ports are reachable directly from the host. That’s what lets you do the following *from* your host, once the Prometheus container is running, to access Prometheus’ own metrics:

curl --silent --request GET http://localhost:9090/metrics

Cloud Functions

Here’s the gnarly Promise-heavy JavaScript code that marries Google Runtime Config, Particle’s JavaScript API, and the Prometheus Node.js client into a Cloud Function:

package.json:
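
A minimal manifest for the sketch that follows; the name and the “latest” version pins are placeholders, and only the Particle and Prometheus client libraries appear here because the sketch below skips the Runtime Config client:

{
  "name": "cloud-functions-prometheus-exporter",
  "version": "0.0.1",
  "main": "index.js",
  "dependencies": {
    "particle-api-js": "latest",
    "prom-client": "latest"
  }
}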

index.js:
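
A minimal sketch of the function, assuming particle-api-js and prom-client. To keep it short it reads the Particle credentials from environment variables rather than Runtime Configurator, registers only the gauge (the deployed version also creates counters and histograms), and adds a “device” label of my own choosing alongside the “variable” label:

// Sketch of the exporter: credentials via environment variables,
// a single gauge labeled by device and variable ("t" and "h").
const Particle = require('particle-api-js');
const client = require('prom-client');

const particle = new Particle();
const register = client.register;

const gauge = new client.Gauge({
  name: 'particle_gauge',
  help: 'Particle device variables',
  labelNames: ['device', 'variable']
});

const VARIABLES = ['t', 'h'];

exports.metrics = (req, res) => {
  const username = process.env.PARTICLE_USERNAME;
  const password = process.env.PARTICLE_PASSWORD;

  particle
    // Exchange the credentials for an access token
    .login({ username: username, password: password })
    .then(login => login.body.access_token)
    // List the devices and keep only those that are online
    .then(auth => particle.listDevices({ auth: auth })
      .then(devices => devices.body.filter(device => device.connected))
      // For every online device, fetch every variable and set the gauge
      .then(devices => Promise.all(devices.map(device =>
        Promise.all(VARIABLES.map(name =>
          particle.getVariable({ deviceId: device.id, name: name, auth: auth })
            .then(data => gauge.set(
              { device: device.name, variable: name },
              data.body.result
            ))
        ))
      ))))
    // Render the registry in the Prometheus exposition format
    .then(() => Promise.resolve(register.metrics()))
    .then(body => {
      res.set('Content-Type', register.contentType);
      res.status(200).send(body);
    })
    .catch(err => res.status(500).send(err.message));
};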

Google Runtime Configurator is a feature of Google Deployment Manager, but it’s really useful with Cloud Functions because it provides a way to store and retrieve configuration data outside the code. In this sample, a Runtime Configurator “config” is used to hold username and password credentials for particle.io:

PROJECT=[[YOUR-PROJECT-ID]]
USERNAME=[[YOUR-PARTICLE-USERNAME]]
PASSWORD=[[YOUR-PARTICLE-PASSWORD]]
CONFIG=particle

gcloud services enable runtimeconfig.googleapis.com \
--project=${PROJECT}

gcloud beta runtime-config configs create ${CONFIG} \
--project=${PROJECT}

gcloud beta runtime-config configs variables \
set username ${USERNAME} \
--is-text \
--config-name=${CONFIG} \
--project=${PROJECT}

gcloud beta runtime-config configs variables \
set password ${PASSWORD} \
--is-text \
--config-name=${CONFIG} \
--project=${PROJECT}
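
To confirm the values were stored, you can read one back:

gcloud beta runtime-config configs variables get-value username \
--config-name=${CONFIG} \
--project=${PROJECT}
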
Cloud Functions: “metrics”

The Cloud Function(s function!?) may be deployed with:

gcloud beta functions deploy metrics \
--local-path=. \
--trigger-http \
--entry-point=metrics \
--project=${PROJECT}

NB There’s an (un|newly-)documented feature that permits deploying directly from the local file-system without going through a staging bucket, which is what I’m doing here.
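
You can confirm the deployment (and grab the function’s httpsTrigger URL) with:

gcloud beta functions describe metrics \
--project=${PROJECT}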

Once the code’s deployed, and if your Particle devices are online and accessible, you should be able to GET your Cloud Function’s /metrics endpoint:

curl --silent --request GET https://us-central1-${PROJECT}.cloudfunctions.net/metrics
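
The response is plain text in the Prometheus exposition format, something like this (device names and values here are illustrative):

# HELP particle_gauge Particle device variables
# TYPE particle_gauge gauge
particle_gauge{device="photon-1",variable="t"} 22.5
particle_gauge{device="photon-1",variable="h"} 41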

Container Builder

As demonstrated recently, it’s trivial to deploy Cloud Functions using Google Container Builder. This is best combined with your preferred source repository (I mostly use Google Cloud Source Repositories [CSR] because I’m a good Googler, because CSR works really well, and because my code is hosted closest to the runtimes) and with Build Triggers to automate the deployment flow.

If you wish to use this approach, please review my other post for details. Here’s a template for the cloudbuild.yaml:

steps:
- name: gcr.io/cloud-builders/gcloud
  args: [
    'beta',
    'functions',
    'deploy', 'metrics',
    '--source=https://source.developers.google.com/projects/${PROJECT_ID}/repos/{REPO}/moveable-aliases/master/paths/',
    '--trigger-http',
    '--entry-point=metrics',
    '--project=${PROJECT_ID}'
  ]
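
If you’d prefer to kick off a build by hand rather than wait for a Build Trigger, submitting the config to Container Builder should do it:

gcloud container builds submit . \
--config=cloudbuild.yaml \
--project=${PROJECT}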

Particle Console

The folks at Particle are doing a *fine* job with their solution. Having explored various devices and platforms, I find Particle one of the easiest to understand and use, and one of the more powerful. [If any of them is reading this, it would be nice to see something direct to Google Cloud IoT ;-)]

Grafana Revisited

Once the Cloud Function is feeding data to Prometheus and Grafana is configured with a Prometheus datasource, you will be able to dynamically (this is very cool) build queries against the Particle metrics.

Create a new dashboard:

New Dashboard

Choose “Graph”. Click “Panel Title” and then “Edit”

Grafana: Edit Graph

Change the “Data Source” from “default” to “Prometheus” (or whatever you named your datasource). Then start typing “particle” and you should see the metrics (counters, gauges and histograms) that were created by the Cloud Function listed.

Grafana-Prometheus integration is excellent

Let’s choose “particle_gauge” and then let’s refine the query to only select those that are labeled as Temperature (“t”). The query should look like:

particle_gauge{variable="t"}

Click the eye icon on the right of the query and then “Back to dashboard”:

Grafana reporting on the Particle variables

If you’d like, click “Add Row” and repeat the process with “h” for Humidity.
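
The humidity query is the same with the label value swapped:

particle_gauge{variable="h"}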

Conclusion

Squints… yeah, I’m still unsure whether this is a valid use-case, but there’s no denying it’s a fun solution, and it clearly demonstrates the power of tools like the Particle Cloud, Google Cloud Functions, Prometheus and Grafana to build compelling integrations with very little work.

I would very much appreciate feedback on the JavaScript code. Those Promises are gnarly, and the chain of get credentials → login → get devices → get variables → resolve and reconcile is a little ‘deep’; I feel the code could be streamlined.

Google Closure Compiler

So, I wondered whether it would be useful to ‘optimize’ my Cloud Functions code using Google Closure Compiler. You can try this for yourself with the App Engine-hosted compiler (here): paste any JavaScript (or even my index.js from above) and see the results of “simple” and “advanced” optimization.

This raises the question: why not add a Container Builder step that runs the Closure Compiler on the code *before* it’s deployed to Cloud Functions?

I’m almost there using the JavaScript implementation and will update this post if I submit the FR. In the meantime, I may create a Builder using the Java implementation.

Tidy-up

You can delete this integration easily by deleting its parts.

Delete the Prometheus and|or Grafana containers:

docker container ls --format="{{.ID}}\t\t{{.Image}}"

f95b959a4adc    grafana/grafana
6c353c1a70ab    prom/prometheus

docker container stop ...
docker container rm ...
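
For example, using the container IDs from the listing above:

docker container stop f95b959a4adc 6c353c1a70ab
docker container rm f95b959a4adc 6c353c1a70ab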

You can delete the single Cloud Function:

gcloud beta functions delete metrics --project=$PROJECT

You can delete the Runtime Config config (and its variables), or just the variables:

gcloud beta runtime-config configs delete ${CONFIG} \
--project=${PROJECT}

gcloud beta runtime-config configs variables unset username \
--config-name=${CONFIG} \
--project=${PROJECT}

You can delete the entire Google Cloud Platform project but — as always — be careful as this is irrevocable and you will delete everything in the project:

gcloud projects delete ${PROJECT} --quiet

That’s all!
