Asset Tracking with IoT on Qubida Platform

Integrate IoT devices, track location of assets and build data point history

Nuno Mourão
Nov 15, 2018 · 4 min read

The latest success story built on Qubida brings together streaming IoT data, real-time ETL and dashboards for real-time tracking of assets, an interesting and rich use case. In this article we explain the solution step by step.

(Note: not technical? You can skip straight to the Dashboard.)

The IoT network — Sigfox and Xperanti

Sigfox is a global IoT network provider and Xperanti is the nationwide operator of the Sigfox network in Malaysia. Xperanti partnered with Qubida to provide its customers an end-to-end platform solution with big data management, advanced analytics and ML-driven insights.

Asset Tracking on Qubida using IoT Data — First Use Case in Production

Step 1: Qubida Connectors for Streaming Data

Using the IoT network APIs, we created callbacks that post the data from the IoT devices directly to Qubida.

Let’s now describe all the steps we took to put the solution in place!

Callback creation on Sigfox Backend Application

Creation of the POST callback on the Sigfox backend. Its purpose is to send the messages directly from the Sigfox network to a Kafka topic on Qubida. Each message contains a set of fields that we explain later in this post.
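The callback body could look roughly like the sketch below. The field names and the `{device}`/`{time}`/`{data}` placeholders follow the Sigfox callback variable convention; the exact schema in production is defined in the callback configuration.

```json
{
  "device": "{device}",
  "time": {time},
  "data": "{data}"
}
```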

Streams Connector — HDFS Sink on Qubida

Creation of the message consumer. Qubida uses Kafka as its messaging system, and on Qubida we can create connectors that produce or consume messages. In this case, with the Sigfox backend as the message producer, we needed a consumer that reads the data from the Kafka topic and stores it somewhere, in our case into a Hive table. As a note, you can see in the image that we also have several other consumers moving data from Kafka to Elasticsearch.
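Conceptually, the consumer step boils down to turning each Kafka message into a row for the sink table. The sketch below assumes the hypothetical field names from the callback above and the `kafka-python` client; the real connector is configured visually on Qubida, not hand-coded.

```python
import json


def parse_message(raw_bytes):
    """Turn one Kafka message value into a row for the Hive table.

    Field names ('device', 'time', 'data') are assumptions; the real
    schema comes from the Sigfox callback configuration.
    """
    rec = json.loads(raw_bytes.decode("utf-8"))
    return (rec["device"], int(rec["time"]), rec["data"])


# The consuming loop would look roughly like this (requires kafka-python):
# from kafka import KafkaConsumer
# consumer = KafkaConsumer("sigfox-messages", bootstrap_servers="broker:9092")
# for msg in consumer:
#     row = parse_message(msg.value)
#     # append row to the Hive-backed table (the HDFS sink's job on Qubida)
```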

Step 2: Data Preparation with Qubida Workflows

Data Preparation — from Hive to Elasticsearch using Spark

Creation of the workflow for data preparation. Why was data preparation needed? The messages posted by the IoT device contain three main fields: device id, timestamp and a data payload.

The device id and timestamp can be used for analysis straight away; the real challenge resides in the payload, which arrives as a 12-byte hexadecimal number.

There were two possible approaches to transforming the hexadecimal data: either transform it at the edge using Kafka SQL (KSQL), supported by Qubida, or use Qubida's visual drag-and-drop interface for data transformation. The workflow approach was chosen because workflows can run the transformation on a periodic schedule.

This hexadecimal number packs much more information: speed, battery, heading direction, trip status, latitude, longitude and more. That means translating the 12-byte hexadecimal number into base-10 values, where latitude and longitude are 4 bytes each, speed and battery are 1 byte each, and the trip status is a single bit (yes, this meant going back to Computer Science 101 and cheekily refreshing our bit/byte-level theory). And little endian versus big endian didn't stay far behind either!
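To make the decoding concrete, here is a minimal sketch of what unpacking such a payload looks like. The byte layout below (latitude and longitude as signed little-endian 32-bit integers scaled by 1e6, then one byte each for speed and battery, then a status bit) is a hypothetical layout for illustration, not the actual device specification.

```python
import struct


def decode_payload(hex_str):
    """Decode a 12-byte hex payload into readable fields.

    Assumed (hypothetical) layout:
      bytes 0-3: latitude  (signed 32-bit little-endian, degrees * 1e6)
      bytes 4-7: longitude (signed 32-bit little-endian, degrees * 1e6)
      byte  8:   speed (km/h)
      byte  9:   battery (%)
      byte  10:  bit 0 = trip status flag; remaining bits reserved
      byte  11:  reserved
    """
    raw = bytes.fromhex(hex_str)
    lat_raw, lon_raw = struct.unpack_from("<ii", raw, 0)
    speed, battery = struct.unpack_from("<BB", raw, 8)
    in_trip = bool(raw[10] & 0x01)
    return {
        "lat": lat_raw / 1e6,
        "lon": lon_raw / 1e6,
        "speed_kmh": speed,
        "battery_pct": battery,
        "in_trip": in_trip,
    }
```

Note the `<` in the format strings: that is the little-endian marker, which is exactly the detail that bites you if the device and the decoder disagree on byte order.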

Step 3: Visualize with Qubida Dashboards

The final objective for visualising the live data was to produce three main reports using the Qubida visualisation layer: one for the latest position of the asset, another containing the history of all the data points and the asset's path, and a last one showing statistical and operational information.

Real time — last asset position

Creation of the real-time analysis. With the first geo map, users can track where their assets are in real time; hovering over each asset shows all the important information collected and prepared: speed, battery, timestamp, device id, plate number and status.

Path History

Creation of the path history analysis. The second tab contains the path history of each device. The most useful filters on this dashboard are the device and the timestamp, allowing the user to analyze the path over several days or a single day, for one or more assets.

Operational / Statistical Dashboard

Creation of the operational analysis. Finally, the third tab of the dashboard shows the speed of the asset and the battery of the sensor over time. Here again the user can filter by device, or even by a specific speed or battery threshold they want to track.

That’s it!

In Phase 2, businesses will build machine learning solutions on the platform to optimize route selection, predict vehicle breakdowns, and monitor and alert when the fleet is misused.

Within a few steps we were able to integrate IoT device data with Qubida, prepare the data and visualize it, creating an advanced enterprise-grade solution for real-time asset tracking and sensor data management.

Qubida Analytics Blog

Enterprises build Machine Learning, AI & IOT Solutions on…
