
Executing commands on-prem using GCP IoT Core devices


Google Cloud Platform logo. Source: Google

I stumbled on this problem during some SaaS musings. If a customer wants their on-prem devices to communicate with your SaaS running in GCP, how would you go about it?

The obvious answer for GCP is service accounts. However, you are limited to 100 service accounts per project in GCP, so that won’t scale well. So what else can authenticate large numbers of devices and works well with GCP? IoT Core, of course!

IoT Core is a fully managed (serverless) service that allows you to securely connect millions of devices globally to GCP. While it’s meant for small devices gathering telemetry data and feeding it back to a central location, it can also be used to send commands to a device and get a response back.

I decided to test it out and create a small IoT client in Python that could receive a DNS name, do a lookup within a network and return the result to IoT Core. So the “command” in this case would be a DNS lookup: a simple proof of concept to see if IoT Core was up to the task. The basic architecture looks like this:

Architecture diagram

Getting Set Up

To start, I followed the GCP Getting Started docs for IoT Core. Once I had it set up, I followed the “Creating registries and devices” docs. I created a registry called “on-prem-devices” in the “us-central1” region using the “MQTT” protocol. I also created a Pub/Sub topic called “on-prem-devices-telemetry” to receive the telemetry data from devices.
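
If you prefer the command line to the web console, the same setup can likely be done with gcloud. A sketch, assuming the gcloud CLI is installed and authenticated against your project:

# Create the Pub/Sub topic that will receive device telemetry
gcloud pubsub topics create on-prem-devices-telemetry

# Create the registry in us-central1, wired to that topic (MQTT is enabled by default)
gcloud iot registries create on-prem-devices \
--region=us-central1 \
--event-notification-config=topic=on-prem-devices-telemetry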

The next step was to add my device. To add a device, you need a certificate to authenticate it, so I generated the key and certificate needed using “openssl”:

openssl req -x509 -nodes -newkey rsa:2048 \
-keyout ./key.pem \
-out crt.pem \
-days 365 \
-subj "/CN=unused"

Once I had the certificate generated, I added a ‘device’ using the Web UI. The “public key value” is the contents of the “crt.pem” file.

GCP “Add device” UI
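
For reference, the same device can also be registered from the CLI. A sketch, assuming the registry created earlier and a device named “my-device”:

gcloud iot devices create my-device \
--region=us-central1 \
--registry=on-prem-devices \
--public-key=path=crt.pem,type=rsa-x509-pem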

Now that the device was created in GCP, I needed to write the code for the device itself. Fortunately, Google has nice docs on how to publish to MQTT that I could follow. Their example code sends telemetry at regular intervals and also handles reauthenticating periodically, which is a bit beyond the scope of this post. So I’ve tweaked the code to make it easier to follow: it now receives commands from IoT Core, does a DNS lookup and then publishes the results back to IoT Core. It looks like this:
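
A minimal sketch of that device code, adapted from Google’s MQTT example rather than reproducing the original gist exactly; the project, region, registry, device and file names are placeholders matching the setup above:

#!/usr/bin/env python3
# dns.py - minimal sketch of the device client described above.
# (Written against the paho-mqtt 1.x callback API.)
import datetime
import socket
import ssl

import jwt                      # pyjwt (RS256 signing also needs the 'cryptography' package)
import paho.mqtt.client as mqtt

project_id = "my-gcp-project"   # placeholder: your GCP project ID
cloud_region = "us-central1"
registry_id = "on-prem-devices"
device_id = "my-device"
private_key_file = "key.pem"    # the key generated with openssl above
ca_certs = "roots.pem"          # Google's root CA bundle


def create_jwt():
    # Create a JWT to authenticate this device to IoT Core.
    now = datetime.datetime.utcnow()
    token = {
        "iat": now,
        "exp": now + datetime.timedelta(minutes=60),
        "aud": project_id,
    }
    with open(private_key_file, "r") as f:
        private_key = f.read()
    return jwt.encode(token, private_key, algorithm="RS256")


def on_connect(client, userdata, flags, rc):
    print("Connected:", mqtt.connack_string(rc))
    # Listen for commands sent to this device.
    client.subscribe(f"/devices/{device_id}/commands/#", qos=1)


def on_message(client, userdata, message):
    name = message.payload.decode("utf-8").strip()
    print("Command received, looking up:", name)
    try:
        address = socket.gethostbyname(name)
    except socket.gaierror as err:
        address = f"lookup failed: {err}"
    # Publish the result back to IoT Core as telemetry.
    client.publish(f"/devices/{device_id}/events", f"{name} -> {address}", qos=1)


client = mqtt.Client(
    client_id=(
        f"projects/{project_id}/locations/{cloud_region}/"
        f"registries/{registry_id}/devices/{device_id}"
    )
)
# IoT Core ignores the username; the JWT is passed as the password.
client.username_pw_set(username="unused", password=create_jwt())
client.tls_set(ca_certs=ca_certs, tls_version=ssl.PROTOCOL_TLSv1_2)
client.on_connect = on_connect
client.on_message = on_message

client.connect("mqtt.googleapis.com", 8883)
client.loop_forever()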

This will connect the device to GCP IoT Core and wait for commands. Once a command containing the DNS name is received, it will do a DNS lookup for that name and publish the address back to IoT Core. Unlike the Google example, this code won’t publish data to IoT Core unless you send it a command. It also doesn’t deal with reauthenticating, which would be needed for production. Despite a few missing niceties, it does prove the concept that you can send authenticated commands to a device running on a customer’s network and receive a response back, all in a secure way.

Running the code

To run the example, you may need to install the ‘pyjwt’ and ‘paho-mqtt’ libraries in your environment. I was using pip3:

pip3 install pyjwt
pip3 install paho-mqtt

Then you will need to put the Python file (dns.py) in the same directory as the private key generated earlier. You will also need to download Google’s root CA file, which can be found in their docs or directly from here, and place it in the same directory too. Edit the ‘project_id’ variable to contain your GCP project ID. You may also need to tweak the ‘registry_id’ and ‘device_id’ variables if your names differ. With that, we can test it out:

Run ‘dns.py’ with Python 3 and it will connect to GCP IoT Core:

output of dns.py running in the CLI

Once it has connected, you can start sending commands. You could hook up a Cloud Function to route commands to a specific device, but a simpler way is to just send commands from the IoT Core UI.

GCP ‘Send command’ UI
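
The same command can also be sent from the CLI. A sketch, assuming the registry and device names used earlier:

gcloud iot devices commands send \
--region=us-central1 \
--registry=on-prem-devices \
--device=my-device \
--command-data="example.com"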

Here we send a command with the DNS name “example.com”. This gets sent down to the Python code running locally, which receives the command, does a lookup on that DNS name and then publishes the response back to IoT Core. The output from the Python code looks like this:

dns.py output after receiving the command

After the Python code has published the response, IoT Core will forward it to the Pub/Sub topic called “on-prem-devices-telemetry”. I created a Cloud Function to listen on the topic and print any data sent to it. The Cloud Function code looks like this:

// Triggered by Pub/Sub messages on the "on-prem-devices-telemetry" topic.
exports.eventDataPubSub = (event, context) => {
  // IoT Core adds the device ID as a message attribute.
  console.log(`Data from: ${event.attributes.deviceId}`);
  // The message payload is base64-encoded by Pub/Sub.
  const pubsubMessage = event.data;
  console.log(Buffer.from(pubsubMessage, 'base64').toString());
};
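
For reference, a function like this could be deployed with gcloud and triggered by the telemetry topic. A sketch (the Node.js runtime version is just an example):

gcloud functions deploy eventDataPubSub \
--runtime=nodejs14 \
--trigger-topic=on-prem-devices-telemetry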

The logs for it after the Python code published its data look like this:

Cloud function logs after response is published

Conclusion

It’s possible to use IoT Core to authenticate devices running in a customer’s network and send commands to those devices. In this example, we ran a simple Python program that does DNS lookups on the local network and connected it to GCP IoT Core. IoT Core handles all the encryption and authentication for us, so we can securely send a DNS name to a customer’s network and get its IP address back.

While this example doesn’t have that much value in itself, we could adapt the concept to do other things. For example, you could hook the Python code up to Puppet Bolt and forward commands to it. Puppet Bolt could then execute scripts, CLI commands and tasks on other devices on the network, as sketched below.
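
A rough sketch of that idea, assuming Puppet Bolt is installed alongside the device client: replace the on_message handler from the earlier sketch so that, instead of a DNS lookup, the command payload is handed to Bolt’s ‘bolt command run’ CLI (the target hostname is a placeholder):

import subprocess

def on_message(client, userdata, message):
    # Treat the command payload as a shell command for Bolt to run
    # on a target host ("web01.example.internal" is a placeholder).
    cmd = message.payload.decode("utf-8").strip()
    result = subprocess.run(
        ["bolt", "command", "run", cmd, "--targets", "web01.example.internal"],
        capture_output=True, text=True,
    )
    # Publish Bolt's output back to IoT Core (device_id comes from the earlier sketch).
    client.publish(f"/devices/{device_id}/events", result.stdout, qos=1)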


Published in Analytics Vidhya


Written by Andrew Hayes
Staff Software Engineer @ Harness Belfast